51243 1727204716.79898: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-MVC
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
51243 1727204716.80447: Added group all to inventory
51243 1727204716.80449: Added group ungrouped to inventory
51243 1727204716.80454: Group all now contains ungrouped
51243 1727204716.80457: Examining possible inventory source: /tmp/network-jrl/inventory-0Xx.yml
51243 1727204717.03973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
51243 1727204717.04043: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
51243 1727204717.04073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
51243 1727204717.04140: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
51243 1727204717.04228: Loaded config def from plugin (inventory/script)
51243 1727204717.04231: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
51243 1727204717.04279: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
51243 1727204717.04390: Loaded config def from plugin (inventory/yaml)
51243 1727204717.04393: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
51243 1727204717.04500: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
51243 1727204717.05046: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
51243 1727204717.05057: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
51243 1727204717.05061: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
51243 1727204717.05069: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
51243 1727204717.05075: Loading data from /tmp/network-jrl/inventory-0Xx.yml
51243 1727204717.05155: /tmp/network-jrl/inventory-0Xx.yml was not parsable by auto
51243 1727204717.05241: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
51243 1727204717.05292: Loading data from /tmp/network-jrl/inventory-0Xx.yml
51243 1727204717.05398: group all already in inventory
51243 1727204717.05405: set inventory_file for managed-node1
51243 1727204717.05410: set inventory_dir for managed-node1
51243 1727204717.05411: Added host managed-node1 to inventory
51243 1727204717.05414: Added host managed-node1 to group all
51243 1727204717.05415: set ansible_host for managed-node1
51243 1727204717.05416: set ansible_ssh_extra_args for managed-node1
51243 1727204717.05419: set inventory_file for managed-node2
51243 1727204717.05423: set inventory_dir for managed-node2
51243 1727204717.05424: Added host managed-node2 to inventory
51243 1727204717.05425: Added host managed-node2 to group all
51243 1727204717.05426: set ansible_host for managed-node2
51243 1727204717.05427: set ansible_ssh_extra_args for managed-node2
51243 1727204717.05430: set inventory_file for managed-node3
51243 1727204717.05436: set inventory_dir for managed-node3
51243 1727204717.05437: Added host managed-node3 to inventory
51243 1727204717.05438: Added host managed-node3 to group all
51243 1727204717.05439: set ansible_host for managed-node3
51243 1727204717.05440: set ansible_ssh_extra_args for managed-node3
51243 1727204717.05443: Reconcile groups and hosts in inventory.
51243 1727204717.05447: Group ungrouped now contains managed-node1
51243 1727204717.05449: Group ungrouped now contains managed-node2
51243 1727204717.05451: Group ungrouped now contains managed-node3
51243 1727204717.05550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
51243 1727204717.05713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
51243 1727204717.05775: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
51243 1727204717.05806: Loaded config def from plugin (vars/host_group_vars)
51243 1727204717.05809: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
51243 1727204717.05823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
51243 1727204717.05836: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
51243 1727204717.05886: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
51243 1727204717.06284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204717.06396: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
51243 1727204717.06445: Loaded config def from plugin (connection/local)
51243 1727204717.06448: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
51243 1727204717.07296: Loaded config def from plugin (connection/paramiko_ssh)
51243 1727204717.07300: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
51243 1727204717.08456: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
51243 1727204717.08506: Loaded config def from plugin (connection/psrp)
51243 1727204717.08510: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
51243 1727204717.09430: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
51243 1727204717.09480: Loaded config def from plugin (connection/ssh)
51243 1727204717.09484: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
51243 1727204717.12024: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
51243 1727204717.12078: Loaded config def from plugin (connection/winrm)
51243 1727204717.12082: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
51243 1727204717.12130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
51243 1727204717.12207: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
51243 1727204717.12291: Loaded config def from plugin (shell/cmd)
51243 1727204717.12294: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
51243 1727204717.12323: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
51243 1727204717.12404: Loaded config def from plugin (shell/powershell)
51243 1727204717.12407: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
51243 1727204717.12476: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
51243 1727204717.12700: Loaded config def from plugin (shell/sh)
51243 1727204717.12704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
51243 1727204717.12743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
51243 1727204717.12900: Loaded config def from plugin (become/runas)
51243 1727204717.12903: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
51243 1727204717.13135: Loaded config def from plugin (become/su)
51243 1727204717.13138: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
51243 1727204717.13337: Loaded config def from plugin (become/sudo)
51243 1727204717.13339: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
51243 1727204717.13383: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
51243 1727204717.13796: in VariableManager get_vars()
51243 1727204717.13822: done with get_vars()
51243 1727204717.13993: trying /usr/local/lib/python3.12/site-packages/ansible/modules
51243 1727204717.17570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
51243 1727204717.17717: in VariableManager get_vars()
51243 1727204717.17723: done with get_vars()
51243 1727204717.17726: variable 'playbook_dir' from source: magic vars
51243 1727204717.17727: variable 'ansible_playbook_python' from source: magic vars
51243 1727204717.17728: variable 'ansible_config_file' from source: magic vars
51243 1727204717.17729: variable 'groups' from source: magic vars
51243 1727204717.17729: variable 'omit' from source: magic vars
51243 1727204717.17730: variable 'ansible_version' from source: magic vars
51243 1727204717.17731: variable 'ansible_check_mode' from source: magic vars
51243 1727204717.17732: variable 'ansible_diff_mode' from source: magic vars
51243 1727204717.17735: variable 'ansible_forks' from source: magic vars
51243 1727204717.17736: variable 'ansible_inventory_sources' from source: magic vars
51243 1727204717.17737: variable 'ansible_skip_tags' from source: magic vars
51243 1727204717.17738: variable 'ansible_limit' from source: magic vars
51243 1727204717.17739: variable 'ansible_run_tags' from source: magic vars
51243 1727204717.17740: variable 'ansible_verbosity' from source: magic vars
51243 1727204717.17795: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml
51243 1727204717.18572: in VariableManager get_vars()
51243 1727204717.18592: done with get_vars()
51243 1727204717.18762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
51243 1727204717.19005: in VariableManager get_vars()
51243 1727204717.19023: done with get_vars()
51243 1727204717.19028: variable 'omit' from source: magic vars
51243 1727204717.19052: variable 'omit' from source: magic vars
51243 1727204717.19101: in VariableManager get_vars()
51243 1727204717.19114: done with get_vars()
51243 1727204717.19172: in VariableManager get_vars()
51243 1727204717.19196: done with get_vars()
51243 1727204717.19242: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
51243 1727204717.19540: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
51243 1727204717.19705: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
51243 1727204717.20546: in VariableManager get_vars()
51243 1727204717.20575: done with get_vars()
51243 1727204717.21094: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
51243 1727204717.21275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
51243 1727204717.23307: in VariableManager get_vars()
51243 1727204717.23346: done with get_vars()
51243 1727204717.23354: variable 'omit' from source: magic vars
51243 1727204717.23368: variable 'omit' from source: magic vars
51243 1727204717.23405: in VariableManager get_vars()
51243 1727204717.23451: done with get_vars()
51243 1727204717.23478: in VariableManager get_vars()
51243 1727204717.23495: done with get_vars()
51243 1727204717.23529: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
51243 1727204717.23686: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
51243 1727204717.23785: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
51243 1727204717.26862: in VariableManager get_vars()
51243 1727204717.26893: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
51243 1727204717.29415: in VariableManager get_vars()
51243 1727204717.29449: done with get_vars()
51243 1727204717.29457: variable 'omit' from source: magic vars
51243 1727204717.29471: variable 'omit' from source: magic vars
51243 1727204717.29504: in VariableManager get_vars()
51243 1727204717.29528: done with get_vars()
51243 1727204717.29556: in VariableManager get_vars()
51243 1727204717.29577: done with get_vars()
51243 1727204717.29607: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
51243 1727204717.29759: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
51243 1727204717.29841: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
51243 1727204717.30363: in VariableManager get_vars()
51243 1727204717.30403: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
51243 1727204717.32824: in VariableManager get_vars()
51243 1727204717.32856: done with get_vars()
51243 1727204717.32862: variable 'omit' from source: magic vars
51243 1727204717.32892: variable 'omit' from source: magic vars
51243 1727204717.32945: in VariableManager get_vars()
51243 1727204717.32970: done with get_vars()
51243 1727204717.32993: in VariableManager get_vars()
51243 1727204717.33015: done with get_vars()
51243 1727204717.33059: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
51243 1727204717.33223: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
51243 1727204717.33326: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
51243 1727204717.33837: in VariableManager get_vars()
51243 1727204717.33870: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
51243 1727204717.36199: in VariableManager get_vars()
51243 1727204717.36240: done with get_vars()
51243 1727204717.36284: in VariableManager get_vars()
51243 1727204717.36317: done with get_vars()
51243 1727204717.36392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
51243 1727204717.36417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
51243 1727204717.36741: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
51243 1727204717.36942: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
51243 1727204717.36945: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
51243 1727204717.36992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
51243 1727204717.37020: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
51243 1727204717.37226: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
51243 1727204717.37307: Loaded config def from plugin (callback/default)
51243 1727204717.37311: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
51243 1727204717.38876: Loaded config def from plugin (callback/junit)
51243 1727204717.38880: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
51243 1727204717.38947: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
51243 1727204717.39029: Loaded config def from plugin (callback/minimal)
51243 1727204717.39035: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
51243 1727204717.39093: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
51243 1727204717.39168: Loaded config def from plugin (callback/tree)
51243 1727204717.39171: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
51243 1727204717.39320: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
51243 1727204717.39323: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
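Every Ansible-internal entry in this capture follows the shape `<pid> <epoch.micros>: message` (pid 51243 here). When a capture tool has collapsed such entries onto single lines, they can be re-split mechanically. A minimal sketch, assuming GNU sed (the `\n` in the replacement is a GNU extension) and hard-coding this run's pid:

```shell
# Re-split a collapsed ANSIBLE_DEBUG-style log: start a new line before
# every "<pid> <epoch>.<frac>:" marker. The pid 51243 is specific to
# this capture; a portable helper would match any digit run instead.
split_debug_log() {
  sed -E 's/ (51243 [0-9]{10}\.[0-9]+:)/\n\1/g' "$@"
}
```

Reads from files given as arguments or from stdin, like sed itself.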
PLAYBOOK: tests_wireless_nm.yml ************************************************
2 plays in /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
51243 1727204717.39356: in VariableManager get_vars()
51243 1727204717.39383: done with get_vars()
51243 1727204717.39391: in VariableManager get_vars()
51243 1727204717.39400: done with get_vars()
51243 1727204717.39409: variable 'omit' from source: magic vars
51243 1727204717.39454: in VariableManager get_vars()
51243 1727204717.39472: done with get_vars()
51243 1727204717.39500: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_wireless.yml' with nm as provider] *********
51243 1727204717.40153: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
51243 1727204717.40242: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
51243 1727204717.40280: getting the remaining hosts for this loop
51243 1727204717.40282: done getting the remaining hosts for this loop
51243 1727204717.40285: getting the next task for host managed-node3
51243 1727204717.40290: done getting next task for host managed-node3
51243 1727204717.40291: ^ task is: TASK: Gathering Facts
51243 1727204717.40293: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204717.40296: getting variables
51243 1727204717.40297: in VariableManager get_vars()
51243 1727204717.40307: Calling all_inventory to load vars for managed-node3
51243 1727204717.40310: Calling groups_inventory to load vars for managed-node3
51243 1727204717.40312: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204717.40326: Calling all_plugins_play to load vars for managed-node3
51243 1727204717.40338: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204717.40341: Calling groups_plugins_play to load vars for managed-node3
51243 1727204717.40381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204717.40440: done with get_vars()
51243 1727204717.40447: done getting variables
51243 1727204717.40537: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Tuesday 24 September 2024 15:05:17 -0400 (0:00:00.013) 0:00:00.013 *****
51243 1727204717.40563: entering _queue_task() for managed-node3/gather_facts
51243 1727204717.40567: Creating lock for gather_facts
51243 1727204717.40944: worker is 1 (out of 1 available)
51243 1727204717.40958: exiting _queue_task() for managed-node3/gather_facts
51243 1727204717.40975: done queuing things up, now waiting for results queue to drain
51243 1727204717.40977: waiting for pending results...
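Two clocks appear in the log: the debug prefixes carry epoch seconds (1727204717), while the profile_tasks banner prints local wall-clock time (15:05:17 -0400). They agree, which can be cross-checked with `date`. A minimal sketch, assuming GNU coreutils (`-d @<seconds>` is a GNU extension; BSD date spells this `-r`):

```shell
# Render the debug-log epoch timestamp in UTC; 19:05:17 UTC is the
# same instant as the 15:05:17 -0400 shown in the task banner.
TZ=UTC date -d @1727204717 '+%Y-%m-%d %H:%M:%S'
# prints: 2024-09-24 19:05:17
```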
51243 1727204717.41284: running TaskExecutor() for managed-node3/TASK: Gathering Facts
51243 1727204717.41325: in run() - task 127b8e07-fff9-5c5d-847b-000000000147
51243 1727204717.41349: variable 'ansible_search_path' from source: unknown
51243 1727204717.41405: calling self._execute()
51243 1727204717.41510: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204717.41514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204717.41520: variable 'omit' from source: magic vars
51243 1727204717.41653: variable 'omit' from source: magic vars
51243 1727204717.41702: variable 'omit' from source: magic vars
51243 1727204717.41770: variable 'omit' from source: magic vars
51243 1727204717.41801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
51243 1727204717.41860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
51243 1727204717.41887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
51243 1727204717.41909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51243 1727204717.41942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
51243 1727204717.42029: variable 'inventory_hostname' from source: host vars for 'managed-node3'
51243 1727204717.42038: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204717.42042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204717.42121: Set connection var ansible_shell_type to sh
51243 1727204717.42150: Set connection var ansible_module_compression to ZIP_DEFLATED
51243 1727204717.42161: Set connection var ansible_connection to ssh
51243 1727204717.42175: Set connection var ansible_pipelining to False
51243 1727204717.42184: Set connection var ansible_shell_executable to /bin/sh
51243 1727204717.42193: Set connection var ansible_timeout to 10
51243 1727204717.42219: variable 'ansible_shell_executable' from source: unknown
51243 1727204717.42226: variable 'ansible_connection' from source: unknown
51243 1727204717.42246: variable 'ansible_module_compression' from source: unknown
51243 1727204717.42249: variable 'ansible_shell_type' from source: unknown
51243 1727204717.42255: variable 'ansible_shell_executable' from source: unknown
51243 1727204717.42267: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204717.42357: variable 'ansible_pipelining' from source: unknown
51243 1727204717.42361: variable 'ansible_timeout' from source: unknown
51243 1727204717.42364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204717.42513: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
51243 1727204717.42530: variable 'omit' from source: magic vars
51243 1727204717.42541: starting attempt loop
51243 1727204717.42548: running the handler
51243 1727204717.42572: variable 'ansible_facts' from source: unknown
51243 1727204717.42670: _low_level_execute_command(): starting
51243 1727204717.42674: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
51243 1727204717.43579: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<<
51243 1727204717.43612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51243 1727204717.43721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51243 1727204717.45588: stdout chunk (state=3): >>>/root <<<
51243 1727204717.45807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51243 1727204717.45811: stdout chunk (state=3): >>><<<
51243 1727204717.45814: stderr chunk (state=3): >>><<<
51243 1727204717.45944: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
51243 1727204717.45948: _low_level_execute_command(): starting
51243 1727204717.45952: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012 `" && echo ansible-tmp-1727204717.4584084-51322-32663684843012="` echo /root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012 `" ) && sleep 0'
51243 1727204717.46588: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
51243 1727204717.46636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<<
51243 1727204717.46653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
51243 1727204717.46682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
51243 1727204717.46791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
51243 1727204717.48978: stdout chunk (state=3): >>>ansible-tmp-1727204717.4584084-51322-32663684843012=/root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012 <<<
51243 1727204717.49194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
51243 1727204717.49198: stdout chunk (state=3): >>><<<
51243 1727204717.49200: stderr chunk (state=3): >>><<<
51243 1727204717.49372: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204717.4584084-51322-32663684843012=/root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
51243 1727204717.49376: variable 'ansible_module_compression' from source: unknown
51243 1727204717.49379: ANSIBALLZ: Using generic lock for ansible.legacy.setup
51243 1727204717.49381: ANSIBALLZ: Acquiring lock
51243 1727204717.49383: ANSIBALLZ: Lock acquired: 139884892115344
51243 1727204717.49386: ANSIBALLZ: Creating module
51243 1727204717.84882: ANSIBALLZ: Writing module into payload
51243 1727204717.85086: ANSIBALLZ: Writing module
51243 1727204717.85126: ANSIBALLZ: Renaming module
51243 1727204717.85142: ANSIBALLZ: Done creating module
51243 1727204717.85195: variable 'ansible_facts' from source: unknown
51243 1727204717.85212: variable 'inventory_hostname' from source: host vars for 'managed-node3'
51243 1727204717.85228: _low_level_execute_command(): starting
51243 1727204717.85241: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
51243 1727204717.85989: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
51243 1727204717.86044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
51243
1727204717.86068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51243 1727204717.86139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204717.86177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 51243 1727204717.86201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204717.86316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51243 1727204717.88172: stdout chunk (state=3): >>>PLATFORM <<< 51243 1727204717.88248: stdout chunk (state=3): >>>Linux <<< 51243 1727204717.88277: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 51243 1727204717.88508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204717.88526: stderr chunk (state=3): >>><<< 51243 1727204717.88536: stdout chunk (state=3): >>><<< 51243 1727204717.88702: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51243 1727204717.88708 [managed-node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 51243 1727204717.88712: _low_level_execute_command(): starting 51243 1727204717.88715: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 51243 1727204717.88788: Sending initial data 51243 1727204717.88798: Sent initial data (1181 bytes) 51243 1727204717.89801: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 51243 1727204717.89829: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51243 1727204717.89856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204717.89972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51243 1727204717.93871: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 51243 1727204717.94579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204717.94583: stdout chunk (state=3): >>><<< 51243 1727204717.94586: stderr chunk (state=3): >>><<< 51243 1727204717.94588: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 
(Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51243 1727204717.94590: variable 'ansible_facts' from source: unknown 51243 1727204717.94592: variable 'ansible_facts' from source: unknown 51243 1727204717.94606: variable 'ansible_module_compression' from source: unknown 51243 1727204717.94654: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-51243vpkpdts3/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 51243 1727204717.94708: variable 'ansible_facts' from source: unknown 51243 1727204717.94955: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012/AnsiballZ_setup.py 51243 1727204717.95269: Sending initial data 51243 1727204717.95273: Sent initial data (153 bytes) 51243 1727204717.95888: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204717.95913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 51243 1727204717.95932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51243 1727204717.95958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204717.96070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51243 1727204717.98260: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 51243 1727204717.98336: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 51243 1727204717.98411: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51243vpkpdts3/tmpx8kd15x8 /root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012/AnsiballZ_setup.py <<< 51243 1727204717.98416: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012/AnsiballZ_setup.py" <<< 51243 1727204717.98532: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-51243vpkpdts3/tmpx8kd15x8" to remote "/root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012/AnsiballZ_setup.py" <<< 51243 1727204717.98540: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012/AnsiballZ_setup.py" <<< 51243 1727204717.99911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204717.99973: stderr chunk (state=3): >>><<< 51243 1727204717.99977: stdout chunk (state=3): >>><<< 51243 1727204718.00002: done transferring module to remote 51243 1727204718.00108: _low_level_execute_command(): starting 51243 1727204718.00112: _low_level_execute_command(): executing: 
/bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012/ /root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012/AnsiballZ_setup.py && sleep 0' 51243 1727204718.00671: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204718.00677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 51243 1727204718.00685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 51243 1727204718.00688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204718.00736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 51243 1727204718.00743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204718.00822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 51243 1727204718.03583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204718.03709: stderr chunk (state=3): >>><<< 51243 1727204718.03713: stdout chunk (state=3): >>><<< 51243 1727204718.03721: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 51243 1727204718.03724: _low_level_execute_command(): starting 51243 1727204718.03728: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012/AnsiballZ_setup.py && sleep 0' 51243 1727204718.04452: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51243 1727204718.04459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51243 1727204718.04462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204718.04465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51243 1727204718.04468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 <<< 51243 1727204718.04472: stderr chunk (state=3): >>>debug2: match not found <<< 51243 1727204718.04478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204718.04480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 51243 1727204718.04482: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 51243 1727204718.04484: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 51243 1727204718.04492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51243 1727204718.04502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204718.04515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51243 1727204718.04523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 51243 1727204718.04530: stderr chunk (state=3): >>>debug2: match found <<< 51243 1727204718.04559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204718.04615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 51243 1727204718.04668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51243 1727204718.04673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204718.04770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51243 1727204718.07309: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 51243 1727204718.07320: stdout chunk (state=3): >>>import _imp # builtin <<< 51243 1727204718.07351: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 51243 1727204718.07414: 
stdout chunk (state=3): >>>import '_io' # <<< 51243 1727204718.07445: stdout chunk (state=3): >>>import 'marshal' # <<< 51243 1727204718.07468: stdout chunk (state=3): >>>import 'posix' # <<< 51243 1727204718.07499: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 51243 1727204718.07538: stdout chunk (state=3): >>># installing zipimport hook import 'time' # <<< 51243 1727204718.07544: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 51243 1727204718.07596: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204718.07624: stdout chunk (state=3): >>>import '_codecs' # <<< 51243 1727204718.07646: stdout chunk (state=3): >>>import 'codecs' # <<< 51243 1727204718.07686: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 51243 1727204718.07723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 51243 1727204718.07727: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18b18530> <<< 51243 1727204718.07780: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18ae7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 51243 1727204718.07784: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18b1aab0> <<< 51243 1727204718.07827: stdout chunk (state=3): >>>import '_signal' # <<< 51243 1727204718.07831: stdout chunk 
(state=3): >>>import '_abc' # <<< 51243 1727204718.07851: stdout chunk (state=3): >>>import 'abc' # <<< 51243 1727204718.07886: stdout chunk (state=3): >>>import 'io' # <<< 51243 1727204718.07892: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 51243 1727204718.07987: stdout chunk (state=3): >>>import '_collections_abc' # <<< 51243 1727204718.08018: stdout chunk (state=3): >>>import 'genericpath' # <<< 51243 1727204718.08045: stdout chunk (state=3): >>>import 'posixpath' # <<< 51243 1727204718.08097: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 51243 1727204718.08101: stdout chunk (state=3): >>>Processing user site-packages <<< 51243 1727204718.08125: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 51243 1727204718.08162: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 51243 1727204718.08197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 51243 1727204718.08200: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc188ed190> <<< 51243 1727204718.08284: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204718.08287: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc188ee090> <<< 51243 1727204718.08318: 
stdout chunk (state=3): >>>import 'site' # <<< 51243 1727204718.08355: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 51243 1727204718.08764: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 51243 1727204718.08784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 51243 1727204718.08822: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204718.08846: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 51243 1727204718.08879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 51243 1727204718.08901: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 51243 1727204718.08933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 51243 1727204718.08961: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1892be60> <<< 51243 1727204718.08984: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 51243 1727204718.09026: stdout chunk (state=3): >>>import '_operator' # <<< 51243 1727204718.09030: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1892bf20> <<< 51243 1727204718.09079: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 51243 1727204718.09083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 51243 1727204718.09099: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 51243 1727204718.09171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204718.09200: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18963830> <<< 51243 1727204718.09234: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 51243 1727204718.09249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18963ec0> <<< 51243 1727204718.09273: stdout chunk (state=3): >>>import '_collections' # <<< 51243 1727204718.09316: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18943b30> <<< 51243 1727204718.09338: stdout chunk (state=3): >>>import '_functools' # <<< 51243 1727204718.09362: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18941250> <<< 51243 1727204718.09470: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18929010> <<< 51243 1727204718.09490: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 51243 1727204718.09530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 51243 1727204718.09555: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 51243 1727204718.09582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 51243 1727204718.09616: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 51243 1727204718.09659: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18987800> <<< 51243 1727204718.09662: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18986420> <<< 51243 1727204718.09700: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18942120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18984c50> <<< 51243 1727204718.09771: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 51243 1727204718.09787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189b8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fcc189282c0> <<< 51243 1727204718.09812: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 51243 1727204718.09850: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204718.09874: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc189b8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189b8bf0> <<< 51243 1727204718.09902: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204718.09914: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc189b8fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18926de0> <<< 51243 1727204718.09946: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204718.09977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 51243 1727204718.10006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 51243 1727204718.10028: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fcc189b9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189b9340> import 'importlib.machinery' # <<< 51243 1727204718.10071: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 51243 1727204718.10105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189ba570> <<< 51243 1727204718.10117: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 51243 1727204718.10137: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 51243 1727204718.10200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 51243 1727204718.10203: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 51243 1727204718.10230: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189d47a0> import 'errno' # <<< 51243 1727204718.10263: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204718.10285: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc189d5ee0> <<< 51243 1727204718.10306: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 51243 1727204718.10341: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 51243 1727204718.10355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189d6d80> <<< 51243 1727204718.10398: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204718.10422: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc189d73e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189d62d0> <<< 51243 1727204718.10446: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 51243 1727204718.10457: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 51243 1727204718.10493: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204718.10513: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc189d7e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189d7560> <<< 51243 1727204718.10570: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189ba5d0> <<< 51243 1727204718.10584: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches 
/usr/lib64/python3.12/tempfile.py <<< 51243 1727204718.10612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 51243 1727204718.10635: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 51243 1727204718.10660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 51243 1727204718.10695: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18717da0> <<< 51243 1727204718.10724: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 51243 1727204718.10747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 51243 1727204718.10772: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18740860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc187405c0> <<< 51243 1727204718.10803: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18740770> <<< 51243 1727204718.10836: 
stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc187409b0> <<< 51243 1727204718.10856: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18715f40> <<< 51243 1727204718.10873: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 51243 1727204718.11006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 51243 1727204718.11037: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 51243 1727204718.11053: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18742090> <<< 51243 1727204718.11077: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18740d10> <<< 51243 1727204718.11092: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189bacc0> <<< 51243 1727204718.11117: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 51243 1727204718.11177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204718.11199: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 51243 1727204718.11250: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 51243 1727204718.11274: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1876e420> <<< 51243 1727204718.11337: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 51243 1727204718.11374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204718.11377: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 51243 1727204718.11392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 51243 1727204718.11451: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18786540> <<< 51243 1727204718.11462: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 51243 1727204718.11509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 51243 1727204718.11573: stdout chunk (state=3): >>>import 'ntpath' # <<< 51243 1727204718.11604: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc187bf2f0> <<< 51243 1727204718.11622: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 51243 1727204718.11662: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 51243 1727204718.11687: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 51243 1727204718.11733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 51243 1727204718.11830: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc187e5a90> <<< 51243 1727204718.11919: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc187bf410> <<< 51243 1727204718.11958: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc187871d0> <<< 51243 1727204718.12002: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 51243 1727204718.12019: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc185c4410> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18785580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18742fc0> <<< 51243 1727204718.12206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 51243 1727204718.12217: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcc18785940> <<< 51243 1727204718.12403: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_iqtr5jbd/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 51243 1727204718.12569: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 
1727204718.12598: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 51243 1727204718.12644: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 51243 1727204718.12750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 51243 1727204718.12767: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1862e120> <<< 51243 1727204718.12789: stdout chunk (state=3): >>>import '_typing' # <<< 51243 1727204718.12993: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18605010> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18604170> <<< 51243 1727204718.12997: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.13039: stdout chunk (state=3): >>>import 'ansible' # <<< 51243 1727204718.13078: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 51243 1727204718.13084: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.13104: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 51243 1727204718.14738: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.16133: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcc18607fb0> <<< 51243 1727204718.16144: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204718.16199: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 51243 1727204718.16233: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 51243 1727204718.16237: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18661b20> <<< 51243 1727204718.16288: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc186618b0> <<< 51243 1727204718.16341: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc186611f0> <<< 51243 1727204718.16345: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 51243 1727204718.16403: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18661c40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1862eb40> import 
'atexit' # <<< 51243 1727204718.16437: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc186628a0> <<< 51243 1727204718.16460: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18662ae0> <<< 51243 1727204718.16491: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 51243 1727204718.16557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 51243 1727204718.16572: stdout chunk (state=3): >>>import '_locale' # <<< 51243 1727204718.16632: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18662ff0> <<< 51243 1727204718.16650: stdout chunk (state=3): >>>import 'pwd' # <<< 51243 1727204718.16674: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 51243 1727204718.16720: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184c4e00> <<< 51243 1727204718.16767: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' 
import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc184c6a20> <<< 51243 1727204718.16792: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 51243 1727204718.16803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 51243 1727204718.16876: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184c7320> <<< 51243 1727204718.16879: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 51243 1727204718.16914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 51243 1727204718.16922: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184c8500> <<< 51243 1727204718.16952: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 51243 1727204718.16979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 51243 1727204718.17006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 51243 1727204718.17079: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184caf90> <<< 51243 1727204718.19917: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc184cb2c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184c9250> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184ceea0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184cd9a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184cd700> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184cfda0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184c9760> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18512f90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18513200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc1851ccb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1851ca70> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc1851f200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1851d3a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpyth<<< 51243 1727204718.19962: stdout chunk (state=3): >>>on-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc185229f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1851f380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc185237d0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18523a40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18523b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc185133b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from 
'/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc185272c0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18528770> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18525a90> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18526e10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18525700> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204718.20009: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.20686: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.21687: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 51243 1727204718.21693: stdout chunk (state=3): >>> <<< 51243 
1727204718.21716: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 51243 1727204718.21735: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 51243 1727204718.21741: stdout chunk (state=3): >>> <<< 51243 1727204718.21760: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 51243 1727204718.21764: stdout chunk (state=3): >>> <<< 51243 1727204718.21814: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 51243 1727204718.21818: stdout chunk (state=3): >>> <<< 51243 1727204718.21851: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 51243 1727204718.21941: stdout chunk (state=3): >>> # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204718.21975: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204718.21978: stdout chunk (state=3): >>>import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc183b08f0><<< 51243 1727204718.21980: stdout chunk (state=3): >>> <<< 51243 1727204718.22140: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 51243 1727204718.22190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183b1730><<< 51243 1727204718.22195: stdout chunk (state=3): >>> <<< 51243 1727204718.22221: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc185241a0><<< 51243 1727204718.22298: stdout chunk (state=3): 
>>> import 'ansible.module_utils.compat.selinux' # <<< 51243 1727204718.22306: stdout chunk (state=3): >>> <<< 51243 1727204718.22334: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.22341: stdout chunk (state=3): >>> <<< 51243 1727204718.22382: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.22387: stdout chunk (state=3): >>> <<< 51243 1727204718.22414: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 51243 1727204718.22447: stdout chunk (state=3): >>> # zipimport: zlib available <<< 51243 1727204718.22750: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.22800: stdout chunk (state=3): >>> <<< 51243 1727204718.23047: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 51243 1727204718.23077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc'<<< 51243 1727204718.23080: stdout chunk (state=3): >>> <<< 51243 1727204718.23103: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183b1700><<< 51243 1727204718.23138: stdout chunk (state=3): >>> # zipimport: zlib available<<< 51243 1727204718.23141: stdout chunk (state=3): >>> <<< 51243 1727204718.24324: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.25039: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.25178: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.25317: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 51243 1727204718.25348: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.25354: stdout chunk (state=3): >>> <<< 51243 1727204718.25426: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.25491: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 
51243 1727204718.25527: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.25532: stdout chunk (state=3): >>> <<< 51243 1727204718.25660: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.25823: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 51243 1727204718.25855: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.25889: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.25915: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 51243 1727204718.25953: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.26076: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 51243 1727204718.26108: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.26111: stdout chunk (state=3): >>> <<< 51243 1727204718.26588: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.26595: stdout chunk (state=3): >>> <<< 51243 1727204718.27063: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 51243 1727204718.27175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 51243 1727204718.27185: stdout chunk (state=3): >>>import '_ast' # <<< 51243 1727204718.27302: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183b2480> <<< 51243 1727204718.27318: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.27442: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.27555: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 51243 1727204718.27592: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 51243 1727204718.27683: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204718.27825: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc183ba2a0> <<< 51243 1727204718.27892: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc183bac00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc185288f0> <<< 51243 1727204718.27921: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.27976: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.28025: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 51243 1727204718.28029: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.28075: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.28120: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.28182: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.28261: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 51243 1727204718.28316: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204718.28439: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc183b9910> <<< 51243 1727204718.28472: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183bad80> <<< 51243 1727204718.28513: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 51243 1727204718.28524: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.28584: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.28667: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.28895: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 51243 1727204718.29121: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from 
'/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 51243 1727204718.29146: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18452e40> <<< 51243 1727204718.29225: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183c4b60> <<< 51243 1727204718.29357: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183c2c60> <<< 51243 1727204718.29371: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183c2ab0> <<< 51243 1727204718.29389: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 51243 1727204718.29422: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.29470: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.29531: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 51243 1727204718.29547: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 51243 1727204718.29837: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 51243 1727204718.29842: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.29844: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.29950: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.29981: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.30019: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.30101: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.30162: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.30239: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 
1727204718.30289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 51243 1727204718.30320: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.30635: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 51243 1727204718.30639: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.30707: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 51243 1727204718.30711: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.31051: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.31395: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.31425: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.31520: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204718.31561: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 51243 1727204718.31591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 51243 1727204718.31637: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18459cd0> <<< 51243 1727204718.31690: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 51243 1727204718.31706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 51243 1727204718.31961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 51243 1727204718.32133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179504a0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc17950800> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184394f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18438740> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184583e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18458080> <<< 51243 1727204718.32302: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204718.32599: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc17953800> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179530b0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc17953290> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179524e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc17953920> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc179ba450> <<< 51243 
1727204718.32673: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179b8470> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18458110> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 51243 1727204718.32720: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.32908: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 51243 1727204718.32927: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.32956: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 51243 1727204718.32985: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 51243 1727204718.33060: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 51243 1727204718.33230: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.33268: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204718.33292: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 51243 1727204718.33359: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.33415: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.33489: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.33542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 51243 1727204718.33559: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 51243 1727204718.34425: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.35130: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204718.35206: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.35253: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.35292: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 51243 1727204718.35319: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 51243 1727204718.35361: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.35390: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 51243 1727204718.35487: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 51243 1727204718.35580: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 51243 1727204718.35599: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.35645: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.35693: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 51243 1727204718.35774: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 51243 1727204718.35825: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.35945: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.36088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 51243 1727204718.36113: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 51243 1727204718.36152: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179bbd40> <<< 51243 1727204718.36208: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 51243 1727204718.36403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 51243 1727204718.36464: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179bb380> <<< 51243 1727204718.36478: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 51243 1727204718.36527: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.36623: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.36735: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 51243 1727204718.36756: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.36916: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.37068: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 51243 1727204718.37093: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.37211: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.37350: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 51243 1727204718.37377: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.37417: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 51243 1727204718.37479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 51243 1727204718.37554: 
stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204718.37636: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc179ea7e0> <<< 51243 1727204718.37850: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179d6de0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 51243 1727204718.37943: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.37985: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 51243 1727204718.38006: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.38081: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.38761: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 51243 1727204718.38849: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 51243 1727204718.38922: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204718.38927: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc178063f0> <<< 51243 1727204718.38938: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc17806000> import 'ansible.module_utils.facts.system.user' # <<< 51243 1727204718.38948: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.38969: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 51243 1727204718.38978: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.39038: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.39099: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 51243 1727204718.39391: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.39654: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 51243 1727204718.39920: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.40025: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.40033: stdout chunk (state=3): >>> <<< 51243 1727204718.40094: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.40120: stdout chunk (state=3): >>> <<< 51243 1727204718.40185: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 51243 1727204718.40191: stdout chunk (state=3): >>> <<< 51243 1727204718.40215: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 51243 1727204718.40221: stdout chunk (state=3): >>> <<< 51243 1727204718.40274: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 51243 1727204718.40280: stdout chunk (state=3): >>> <<< 51243 1727204718.40313: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.40499: stdout chunk (state=3): >>> <<< 51243 
1727204718.40596: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.40605: stdout chunk (state=3): >>> <<< 51243 1727204718.40864: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 51243 1727204718.40874: stdout chunk (state=3): >>> <<< 51243 1727204718.40893: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 51243 1727204718.40926: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.40934: stdout chunk (state=3): >>> <<< 51243 1727204718.41162: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.41164: stdout chunk (state=3): >>> <<< 51243 1727204718.41389: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 51243 1727204718.41395: stdout chunk (state=3): >>> <<< 51243 1727204718.41430: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.41432: stdout chunk (state=3): >>> <<< 51243 1727204718.41489: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.41495: stdout chunk (state=3): >>> <<< 51243 1727204718.41568: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.41574: stdout chunk (state=3): >>> <<< 51243 1727204718.42681: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.43696: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 51243 1727204718.43740: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available<<< 51243 1727204718.43747: stdout chunk (state=3): >>> <<< 51243 1727204718.44003: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.44136: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 51243 1727204718.44159: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.44169: stdout chunk (state=3): >>> <<< 51243 1727204718.44349: 
stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.44352: stdout chunk (state=3): >>> <<< 51243 1727204718.44535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 51243 1727204718.44555: stdout chunk (state=3): >>> # zipimport: zlib available<<< 51243 1727204718.44564: stdout chunk (state=3): >>> <<< 51243 1727204718.44903: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.45129: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 51243 1727204718.45160: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.45194: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.45218: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 51243 1727204718.45248: stdout chunk (state=3): >>> # zipimport: zlib available<<< 51243 1727204718.45254: stdout chunk (state=3): >>> <<< 51243 1727204718.45383: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 51243 1727204718.45391: stdout chunk (state=3): >>> <<< 51243 1727204718.45414: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.45418: stdout chunk (state=3): >>> <<< 51243 1727204718.45591: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.45597: stdout chunk (state=3): >>> <<< 51243 1727204718.45772: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.46005: stdout chunk (state=3): >>> <<< 51243 1727204718.46187: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.46190: stdout chunk (state=3): >>> <<< 51243 1727204718.46580: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 51243 1727204718.46589: stdout chunk (state=3): >>> <<< 51243 1727204718.46606: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 51243 1727204718.46615: 
stdout chunk (state=3): >>> <<< 51243 1727204718.46639: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.46645: stdout chunk (state=3): >>> <<< 51243 1727204718.46708: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.46714: stdout chunk (state=3): >>> <<< 51243 1727204718.46775: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 51243 1727204718.46778: stdout chunk (state=3): >>> <<< 51243 1727204718.46806: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.46811: stdout chunk (state=3): >>> <<< 51243 1727204718.46853: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.46859: stdout chunk (state=3): >>> <<< 51243 1727204718.46894: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 51243 1727204718.46904: stdout chunk (state=3): >>> <<< 51243 1727204718.46934: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.46937: stdout chunk (state=3): >>> <<< 51243 1727204718.47062: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.47070: stdout chunk (state=3): >>> <<< 51243 1727204718.47190: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 51243 1727204718.47198: stdout chunk (state=3): >>> <<< 51243 1727204718.47226: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.47229: stdout chunk (state=3): >>> <<< 51243 1727204718.47272: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.47279: stdout chunk (state=3): >>> <<< 51243 1727204718.47318: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 51243 1727204718.47324: stdout chunk (state=3): >>> <<< 51243 1727204718.47344: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.47408: stdout chunk (state=3): >>> <<< 51243 1727204718.47463: stdout chunk (state=3): >>># zipimport: 
zlib available<<< 51243 1727204718.47468: stdout chunk (state=3): >>> <<< 51243 1727204718.47570: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 51243 1727204718.47576: stdout chunk (state=3): >>> <<< 51243 1727204718.47589: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.47691: stdout chunk (state=3): >>> # zipimport: zlib available<<< 51243 1727204718.47697: stdout chunk (state=3): >>> <<< 51243 1727204718.47793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 51243 1727204718.47799: stdout chunk (state=3): >>> <<< 51243 1727204718.47828: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.48376: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.48898: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 51243 1727204718.48926: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.49038: stdout chunk (state=3): >>> # zipimport: zlib available<<< 51243 1727204718.49041: stdout chunk (state=3): >>> <<< 51243 1727204718.49142: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 51243 1727204718.49150: stdout chunk (state=3): >>> <<< 51243 1727204718.49170: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.49177: stdout chunk (state=3): >>> <<< 51243 1727204718.49240: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.49245: stdout chunk (state=3): >>> <<< 51243 1727204718.49307: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 51243 1727204718.49310: stdout chunk (state=3): >>> <<< 51243 1727204718.49335: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.49339: stdout chunk (state=3): >>> <<< 51243 1727204718.49394: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.49401: stdout chunk (state=3): >>> <<< 51243 
1727204718.49469: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available<<< 51243 1727204718.49526: stdout chunk (state=3): >>> # zipimport: zlib available<<< 51243 1727204718.49532: stdout chunk (state=3): >>> <<< 51243 1727204718.49585: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 51243 1727204718.49616: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.49761: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.49800: stdout chunk (state=3): >>> <<< 51243 1727204718.49898: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 51243 1727204718.49903: stdout chunk (state=3): >>> <<< 51243 1727204718.49927: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.49952: stdout chunk (state=3): >>> # zipimport: zlib available<<< 51243 1727204718.49963: stdout chunk (state=3): >>> <<< 51243 1727204718.49971: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 51243 1727204718.50003: stdout chunk (state=3): >>> # zipimport: zlib available<<< 51243 1727204718.50072: stdout chunk (state=3): >>> # zipimport: zlib available<<< 51243 1727204718.50089: stdout chunk (state=3): >>> <<< 51243 1727204718.50162: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 51243 1727204718.50168: stdout chunk (state=3): >>> <<< 51243 1727204718.50190: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.50196: stdout chunk (state=3): >>> <<< 51243 1727204718.50235: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.50245: stdout chunk (state=3): >>> <<< 51243 1727204718.50298: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.50357: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.50363: stdout chunk (state=3): >>> <<< 51243 1727204718.50503: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.50578: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.50584: stdout chunk (state=3): >>> <<< 51243 1727204718.50703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 51243 1727204718.50715: stdout chunk (state=3): >>> <<< 51243 1727204718.50735: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # <<< 51243 1727204718.50748: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 51243 1727204718.50776: stdout chunk (state=3): >>> # zipimport: zlib available<<< 51243 1727204718.50779: stdout chunk (state=3): >>> <<< 51243 1727204718.50935: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # <<< 51243 1727204718.50960: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.50967: stdout chunk (state=3): >>> <<< 51243 1727204718.51356: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.51743: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 51243 1727204718.51770: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204718.51921: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # <<< 51243 1727204718.52104: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 51243 1727204718.52234: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.52241: stdout chunk (state=3): >>> <<< 51243 1727204718.52376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 51243 1727204718.52394: stdout chunk (state=3): >>> <<< 51243 1727204718.52398: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 
51243 1727204718.52401: stdout chunk (state=3): >>> <<< 51243 1727204718.52572: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 51243 1727204718.52578: stdout chunk (state=3): >>> <<< 51243 1727204718.52736: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 51243 1727204718.52747: stdout chunk (state=3): >>> <<< 51243 1727204718.52763: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # <<< 51243 1727204718.52773: stdout chunk (state=3): >>>import 'ansible.module_utils.facts' # <<< 51243 1727204718.52788: stdout chunk (state=3): >>> <<< 51243 1727204718.52915: stdout chunk (state=3): >>># zipimport: zlib available<<< 51243 1727204718.52921: stdout chunk (state=3): >>> <<< 51243 1727204718.53687: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py<<< 51243 1727204718.53694: stdout chunk (state=3): >>> <<< 51243 1727204718.53711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 51243 1727204718.53751: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py<<< 51243 1727204718.53755: stdout chunk (state=3): >>> <<< 51243 1727204718.53791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc'<<< 51243 1727204718.53796: stdout chunk (state=3): >>> <<< 51243 1727204718.53846: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'<<< 51243 1727204718.53890: stdout chunk (state=3): >>> import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fcc1782ee40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1782e960> <<< 51243 1727204718.53978: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1782e750><<< 51243 1727204718.54047: stdout chunk (state=3): >>> <<< 51243 1727204718.76218: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 51243 1727204718.76222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 51243 1727204718.76268: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc17874a40> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 51243 1727204718.76305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc17875e20> <<< 51243 1727204718.76354: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 51243 1727204718.76385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 51243 1727204718.76438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179e05f0> 
<<< 51243 1727204718.76465: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc17877c20> <<< 51243 1727204718.76757: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 51243 1727204718.98057: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", 
"ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off 
[fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": 
[{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off 
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_<<< 51243 1727204718.98084: stdout chunk (state=3): >>>ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 2970, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 746, "free": 2970}, "nocache": {"free": 3424, "used": 292}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1056, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251292233728, "block_size": 4096, "block_total": 64479564, "block_available": 61350643, "block_used": 3128921, "inode_total": 16384000, 
"inode_available": 16301237, "inode_used": 82763, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "05", "second": "18", "epoch": "1727204718", "epoch_int": "1727204718", "date": "2024-09-24", "time": "15:05:18", "iso8601_micro": "2024-09-24T19:05:18.966585Z", "iso8601": "2024-09-24T19:05:18Z", "iso8601_basic": "20240924T150518966585", "iso8601_basic_short": "20240924T150518", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_ho<<< 51243 1727204718.98092: stdout chunk (state=3): >>>stnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, 
"ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.44287109375, "5m": 0.57373046875, "15m": 0.4306640625}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 51243 1727204718.99221: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing 
encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path<<< 51243 1727204718.99256: stdout chunk (state=3): >>> # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re <<< 51243 1727204718.99294: stdout chunk (state=3): >>># cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # 
cleanup[2] removing math<<< 51243 1727204718.99317: stdout chunk (state=3): >>> # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading<<< 51243 1727204718.99345: stdout chunk (state=3): >>> # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path<<< 51243 1727204718.99367: stdout chunk (state=3): >>> # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils<<< 51243 1727204718.99392: stdout chunk (state=3): >>> # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp <<< 51243 1727204718.99414: stdout chunk (state=3): >>># cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token<<< 51243 1727204718.99442: stdout chunk (state=3): >>> # 
cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime<<< 51243 1727204718.99461: stdout chunk (state=3): >>> # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket<<< 51243 1727204718.99485: stdout chunk (state=3): >>> # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text<<< 51243 1727204718.99510: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes <<< 51243 1727204718.99544: stdout chunk (state=3): >>># destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections 
# cleanup[2] removing ansible.module_utils.common.warnings<<< 51243 1727204718.99564: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool<<< 51243 1727204718.99573: stdout chunk (state=3): >>> # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters<<< 51243 1727204718.99756: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy 
ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing 
ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansi<<< 51243 1727204718.99763: stdout chunk (state=3): >>>ble.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy<<< 51243 1727204718.99767: stdout chunk (state=3): >>> ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 51243 1727204719.00494: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 51243 1727204719.00500: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 51243 1727204719.00538: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression <<< 51243 1727204719.00556: stdout chunk (state=3): >>># destroy _lzma <<< 51243 1727204719.00560: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma <<< 51243 1727204719.00583: stdout 
chunk (state=3): >>># destroy zipfile._path <<< 51243 1727204719.00589: stdout chunk (state=3): >>># destroy zipfile <<< 51243 1727204719.00813: stdout chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob <<< 51243 1727204719.00822: stdout chunk (state=3): >>># destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 51243 1727204719.00846: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 51243 1727204719.00912: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 51243 1727204719.00929: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal <<< 51243 1727204719.00938: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle <<< 51243 1727204719.00979: stdout chunk (state=3): >>># destroy _pickle <<< 51243 1727204719.00996: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue <<< 51243 1727204719.01004: stdout chunk (state=3): >>># destroy multiprocessing.reduction <<< 51243 1727204719.01010: stdout chunk (state=3): >>># destroy selectors <<< 51243 1727204719.01040: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 51243 1727204719.01076: stdout chunk (state=3): >>># destroy datetime <<< 51243 1727204719.01079: stdout chunk (state=3): >>># destroy subprocess <<< 51243 1727204719.01081: 
stdout chunk (state=3): >>># destroy base64 <<< 51243 1727204719.01108: stdout chunk (state=3): >>># destroy _ssl <<< 51243 1727204719.01148: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 51243 1727204719.01161: stdout chunk (state=3): >>># destroy pwd # destroy termios <<< 51243 1727204719.01176: stdout chunk (state=3): >>># destroy json <<< 51243 1727204719.01446: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # 
cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 51243 1727204719.01450: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct <<< 51243 1727204719.01455: stdout chunk (state=3): >>># cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 51243 1727204719.01486: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 51243 1727204719.01490: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 51243 1727204719.01522: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 51243 1727204719.01530: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 51243 1727204719.01568: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 51243 1727204719.01573: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases <<< 51243 1727204719.01585: stdout chunk (state=3): >>># cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time <<< 51243 1727204719.01608: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref <<< 51243 1727204719.01614: stdout chunk (state=3): >>># cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 51243 1727204719.01645: stdout chunk (state=3): >>># 
cleanup[3] wiping builtins <<< 51243 1727204719.01652: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 51243 1727204719.02219: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 51243 1727204719.02310: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig <<< 51243 1727204719.02329: stdout chunk (state=3): >>># destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback <<< 51243 1727204719.02345: stdout chunk (state=3): >>># destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings<<< 51243 1727204719.02353: stdout chunk (state=3): >>> # destroy math # destroy _bisect <<< 51243 1727204719.02358: stdout chunk (state=3): >>># destroy time <<< 51243 1727204719.02402: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 51243 1727204719.02442: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre <<< 51243 1727204719.02449: stdout chunk (state=3): 
>>># destroy _string # destroy re <<< 51243 1727204719.02472: stdout chunk (state=3): >>># destroy itertools <<< 51243 1727204719.02493: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 51243 1727204719.02499: stdout chunk (state=3): >>># destroy _thread <<< 51243 1727204719.02516: stdout chunk (state=3): >>># clear sys.audit hooks <<< 51243 1727204719.03187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 51243 1727204719.03255: stderr chunk (state=3): >>><<< 51243 1727204719.03258: stdout chunk (state=3): >>><<< 51243 1727204719.03363: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18b18530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18ae7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18b1aab0> import '_signal' # import '_abc' # 
import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc188ed190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc188ee090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1892be60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1892bf20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18963830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcc18963ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18943b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18941250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18929010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18987800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18986420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18942120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18984c50> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189b8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189282c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc189b8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189b8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc189b8fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18926de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189b9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189b9340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189ba570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189d47a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc189d5ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189d6d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc189d73e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189d62d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc189d7e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189d7560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189ba5d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18717da0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18740860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc187405c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18740770> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc187409b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18715f40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18742090> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18740d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc189bacc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1876e420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18786540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc187bf2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc187e5a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc187bf410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc187871d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc185c4410> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18785580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18742fc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcc18785940> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_iqtr5jbd/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # 
code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1862e120> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18605010> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18604170> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18607fb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18661b20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc186618b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc186611f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18661c40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1862eb40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc186628a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18662ae0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18662ff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184c4e00> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc184c6a20> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184c7320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184c8500> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184caf90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc184cb2c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184c9250> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184ceea0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184cd9a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184cd700> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184cfda0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184c9760> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18512f90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18513200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc1851ccb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1851ca70> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc1851f200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1851d3a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc185229f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1851f380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc185237d0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18523a40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18523b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc185133b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc185272c0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18528770> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18525a90> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc18526e10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18525700> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc183b08f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183b1730> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc185241a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183b1700> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183b2480> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc183ba2a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc183bac00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc185288f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc183b9910> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183bad80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18452e40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183c4b60> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183c2c60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc183c2ab0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18459cd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179504a0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc17950800> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fcc184394f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18438740> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc184583e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18458080> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc17953800> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179530b0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc17953290> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179524e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcc17953920> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc179ba450> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179b8470> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc18458110> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179bbd40> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179bb380> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc179ea7e0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179d6de0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc178063f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc17806000> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 
'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcc1782ee40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1782e960> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc1782e750> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc17874a40> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc17875e20> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc179e05f0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcc17877c20> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": 
"on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, 
"active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 2970, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 746, "free": 2970}, "nocache": {"free": 3424, "used": 292}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", 
"ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1056, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 
0, "size_total": 264108294144, "size_available": 251292233728, "block_size": 4096, "block_total": 64479564, "block_available": 61350643, "block_used": 3128921, "inode_total": 16384000, "inode_available": 16301237, "inode_used": 82763, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "05", "second": "18", "epoch": "1727204718", "epoch_int": "1727204718", "date": "2024-09-24", "time": "15:05:18", "iso8601_micro": "2024-09-24T19:05:18.966585Z", "iso8601": "2024-09-24T19:05:18Z", "iso8601_basic": "20240924T150518966585", "iso8601_basic_short": "20240924T150518", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.44287109375, "5m": 0.57373046875, "15m": 0.4306640625}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] 
removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # 
cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] 
removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] 
removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy 
ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing 
ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy 
ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy 
ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy 
multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy 
re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # 
destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache … # clear sys.audit hooks (the same Python interpreter shutdown trace reproduced in full above) [WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
51243 1727204719.04257: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51243 1727204719.04261: _low_level_execute_command(): starting 51243 1727204719.04263: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204717.4584084-51322-32663684843012/ > /dev/null 2>&1 && sleep 0' 51243 1727204719.04518: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204719.04522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204719.04525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204719.04527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204719.04585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 51243 1727204719.04589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204719.04683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51243 1727204719.07470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204719.07527: stderr chunk (state=3): >>><<< 51243 1727204719.07535: stdout chunk (state=3): >>><<< 51243 1727204719.07549: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51243 
1727204719.07558: handler run complete 51243 1727204719.07656: variable 'ansible_facts' from source: unknown 51243 1727204719.07743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204719.07985: variable 'ansible_facts' from source: unknown 51243 1727204719.08054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204719.08152: attempt loop complete, returning result 51243 1727204719.08156: _execute() done 51243 1727204719.08159: dumping result to json 51243 1727204719.08182: done dumping result, returning 51243 1727204719.08192: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [127b8e07-fff9-5c5d-847b-000000000147] 51243 1727204719.08200: sending task result for task 127b8e07-fff9-5c5d-847b-000000000147 51243 1727204719.08482: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000147 51243 1727204719.08485: WORKER PROCESS EXITING ok: [managed-node3] 51243 1727204719.08795: no more pending results, returning what we have 51243 1727204719.08798: results queue empty 51243 1727204719.08798: checking for any_errors_fatal 51243 1727204719.08799: done checking for any_errors_fatal 51243 1727204719.08800: checking for max_fail_percentage 51243 1727204719.08801: done checking for max_fail_percentage 51243 1727204719.08801: checking to see if all hosts have failed and the running result is not ok 51243 1727204719.08802: done checking to see if all hosts have failed 51243 1727204719.08803: getting the remaining hosts for this loop 51243 1727204719.08804: done getting the remaining hosts for this loop 51243 1727204719.08807: getting the next task for host managed-node3 51243 1727204719.08812: done getting next task for host managed-node3 51243 1727204719.08813: ^ task is: TASK: meta (flush_handlers) 51243 1727204719.08814: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204719.08818: getting variables 51243 1727204719.08819: in VariableManager get_vars() 51243 1727204719.08839: Calling all_inventory to load vars for managed-node3 51243 1727204719.08841: Calling groups_inventory to load vars for managed-node3 51243 1727204719.08843: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204719.08851: Calling all_plugins_play to load vars for managed-node3 51243 1727204719.08853: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204719.08855: Calling groups_plugins_play to load vars for managed-node3 51243 1727204719.08992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204719.09136: done with get_vars() 51243 1727204719.09146: done getting variables 51243 1727204719.09201: in VariableManager get_vars() 51243 1727204719.09208: Calling all_inventory to load vars for managed-node3 51243 1727204719.09210: Calling groups_inventory to load vars for managed-node3 51243 1727204719.09212: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204719.09217: Calling all_plugins_play to load vars for managed-node3 51243 1727204719.09220: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204719.09223: Calling groups_plugins_play to load vars for managed-node3 51243 1727204719.09340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204719.09476: done with get_vars() 51243 1727204719.09488: done queuing things up, now waiting for results queue to drain 51243 1727204719.09490: results queue empty 51243 1727204719.09491: checking for any_errors_fatal 51243 1727204719.09492: done checking for any_errors_fatal 51243 
1727204719.09493: checking for max_fail_percentage 51243 1727204719.09494: done checking for max_fail_percentage 51243 1727204719.09498: checking to see if all hosts have failed and the running result is not ok 51243 1727204719.09499: done checking to see if all hosts have failed 51243 1727204719.09499: getting the remaining hosts for this loop 51243 1727204719.09500: done getting the remaining hosts for this loop 51243 1727204719.09502: getting the next task for host managed-node3 51243 1727204719.09506: done getting next task for host managed-node3 51243 1727204719.09508: ^ task is: TASK: Include the task 'el_repo_setup.yml' 51243 1727204719.09509: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204719.09511: getting variables 51243 1727204719.09511: in VariableManager get_vars() 51243 1727204719.09517: Calling all_inventory to load vars for managed-node3 51243 1727204719.09519: Calling groups_inventory to load vars for managed-node3 51243 1727204719.09520: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204719.09524: Calling all_plugins_play to load vars for managed-node3 51243 1727204719.09526: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204719.09527: Calling groups_plugins_play to load vars for managed-node3 51243 1727204719.09627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204719.09846: done with get_vars() 51243 1727204719.09855: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:11 Tuesday 24 September 2024 15:05:19 
-0400 (0:00:01.693) 0:00:01.707 ***** 51243 1727204719.09946: entering _queue_task() for managed-node3/include_tasks 51243 1727204719.09948: Creating lock for include_tasks 51243 1727204719.10326: worker is 1 (out of 1 available) 51243 1727204719.10344: exiting _queue_task() for managed-node3/include_tasks 51243 1727204719.10359: done queuing things up, now waiting for results queue to drain 51243 1727204719.10361: waiting for pending results... 51243 1727204719.10791: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 51243 1727204719.10796: in run() - task 127b8e07-fff9-5c5d-847b-000000000006 51243 1727204719.10799: variable 'ansible_search_path' from source: unknown 51243 1727204719.10809: calling self._execute() 51243 1727204719.10942: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204719.10956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204719.10960: variable 'omit' from source: magic vars 51243 1727204719.11051: _execute() done 51243 1727204719.11055: dumping result to json 51243 1727204719.11058: done dumping result, returning 51243 1727204719.11064: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [127b8e07-fff9-5c5d-847b-000000000006] 51243 1727204719.11072: sending task result for task 127b8e07-fff9-5c5d-847b-000000000006 51243 1727204719.11227: no more pending results, returning what we have 51243 1727204719.11234: in VariableManager get_vars() 51243 1727204719.11274: Calling all_inventory to load vars for managed-node3 51243 1727204719.11277: Calling groups_inventory to load vars for managed-node3 51243 1727204719.11281: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204719.11295: Calling all_plugins_play to load vars for managed-node3 51243 1727204719.11298: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204719.11301: Calling groups_plugins_play to load vars 
for managed-node3 51243 1727204719.11500: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000006 51243 1727204719.11504: WORKER PROCESS EXITING 51243 1727204719.11514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204719.11652: done with get_vars() 51243 1727204719.11659: variable 'ansible_search_path' from source: unknown 51243 1727204719.11673: we have included files to process 51243 1727204719.11674: generating all_blocks data 51243 1727204719.11675: done generating all_blocks data 51243 1727204719.11675: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 51243 1727204719.11676: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 51243 1727204719.11678: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 51243 1727204719.12170: in VariableManager get_vars() 51243 1727204719.12182: done with get_vars() 51243 1727204719.12190: done processing included file 51243 1727204719.12191: iterating over new_blocks loaded from include file 51243 1727204719.12192: in VariableManager get_vars() 51243 1727204719.12198: done with get_vars() 51243 1727204719.12199: filtering new block on tags 51243 1727204719.12210: done filtering new block on tags 51243 1727204719.12212: in VariableManager get_vars() 51243 1727204719.12217: done with get_vars() 51243 1727204719.12218: filtering new block on tags 51243 1727204719.12228: done filtering new block on tags 51243 1727204719.12230: in VariableManager get_vars() 51243 1727204719.12258: done with get_vars() 51243 1727204719.12259: filtering new block on tags 51243 1727204719.12272: done filtering new block on tags 51243 1727204719.12274: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 51243 1727204719.12279: extending task lists for all hosts with included blocks 51243 1727204719.12310: done extending task lists 51243 1727204719.12311: done processing included files 51243 1727204719.12312: results queue empty 51243 1727204719.12312: checking for any_errors_fatal 51243 1727204719.12313: done checking for any_errors_fatal 51243 1727204719.12314: checking for max_fail_percentage 51243 1727204719.12314: done checking for max_fail_percentage 51243 1727204719.12315: checking to see if all hosts have failed and the running result is not ok 51243 1727204719.12315: done checking to see if all hosts have failed 51243 1727204719.12316: getting the remaining hosts for this loop 51243 1727204719.12317: done getting the remaining hosts for this loop 51243 1727204719.12318: getting the next task for host managed-node3 51243 1727204719.12321: done getting next task for host managed-node3 51243 1727204719.12322: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 51243 1727204719.12324: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204719.12325: getting variables 51243 1727204719.12326: in VariableManager get_vars() 51243 1727204719.12332: Calling all_inventory to load vars for managed-node3 51243 1727204719.12335: Calling groups_inventory to load vars for managed-node3 51243 1727204719.12337: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204719.12341: Calling all_plugins_play to load vars for managed-node3 51243 1727204719.12342: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204719.12344: Calling groups_plugins_play to load vars for managed-node3 51243 1727204719.12447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204719.12582: done with get_vars() 51243 1727204719.12589: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:05:19 -0400 (0:00:00.026) 0:00:01.734 ***** 51243 1727204719.12640: entering _queue_task() for managed-node3/setup 51243 1727204719.12908: worker is 1 (out of 1 available) 51243 1727204719.12922: exiting _queue_task() for managed-node3/setup 51243 1727204719.12936: done queuing things up, now waiting for results queue to drain 51243 1727204719.12938: waiting for pending results... 
51243 1727204719.13099: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 51243 1727204719.13172: in run() - task 127b8e07-fff9-5c5d-847b-000000000158 51243 1727204719.13181: variable 'ansible_search_path' from source: unknown 51243 1727204719.13185: variable 'ansible_search_path' from source: unknown 51243 1727204719.13219: calling self._execute() 51243 1727204719.13286: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204719.13296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204719.13302: variable 'omit' from source: magic vars 51243 1727204719.13736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51243 1727204719.16074: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51243 1727204719.16078: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51243 1727204719.16103: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51243 1727204719.16355: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51243 1727204719.16393: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51243 1727204719.16489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51243 1727204719.16526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51243 1727204719.16558: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51243 1727204719.16618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51243 1727204719.16640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51243 1727204719.16824: variable 'ansible_facts' from source: unknown 51243 1727204719.16906: variable 'network_test_required_facts' from source: task vars 51243 1727204719.16958: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 51243 1727204719.16984: variable 'omit' from source: magic vars 51243 1727204719.17031: variable 'omit' from source: magic vars 51243 1727204719.17271: variable 'omit' from source: magic vars 51243 1727204719.17274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51243 1727204719.17281: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51243 1727204719.17284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51243 1727204719.17286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51243 1727204719.17289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51243 1727204719.17291: variable 'inventory_hostname' from source: host vars for 'managed-node3' 51243 1727204719.17293: variable 'ansible_host' from source: host vars for 
'managed-node3' 51243 1727204719.17296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204719.17354: Set connection var ansible_shell_type to sh 51243 1727204719.17375: Set connection var ansible_module_compression to ZIP_DEFLATED 51243 1727204719.17384: Set connection var ansible_connection to ssh 51243 1727204719.17399: Set connection var ansible_pipelining to False 51243 1727204719.17409: Set connection var ansible_shell_executable to /bin/sh 51243 1727204719.17418: Set connection var ansible_timeout to 10 51243 1727204719.17447: variable 'ansible_shell_executable' from source: unknown 51243 1727204719.17454: variable 'ansible_connection' from source: unknown 51243 1727204719.17461: variable 'ansible_module_compression' from source: unknown 51243 1727204719.17469: variable 'ansible_shell_type' from source: unknown 51243 1727204719.17476: variable 'ansible_shell_executable' from source: unknown 51243 1727204719.17482: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204719.17490: variable 'ansible_pipelining' from source: unknown 51243 1727204719.17497: variable 'ansible_timeout' from source: unknown 51243 1727204719.17506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204719.17667: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 51243 1727204719.17684: variable 'omit' from source: magic vars 51243 1727204719.17692: starting attempt loop 51243 1727204719.17699: running the handler 51243 1727204719.17716: _low_level_execute_command(): starting 51243 1727204719.17729: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51243 1727204719.18485: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51243 
1727204719.18522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51243 1727204719.18542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204719.18565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51243 1727204719.18595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 51243 1727204719.18696: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 51243 1727204719.18717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204719.18800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51243 1727204719.20968: stdout chunk (state=3): >>>/root <<< 51243 1727204719.21206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204719.21222: stdout chunk (state=3): >>><<< 51243 1727204719.21254: stderr chunk (state=3): >>><<< 51243 1727204719.21289: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51243 1727204719.21317: _low_level_execute_command(): starting 51243 1727204719.21329: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716 `" && echo ansible-tmp-1727204719.213042-51359-100948063728716="` echo /root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716 `" ) && sleep 0' 51243 1727204719.22123: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51243 1727204719.22141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51243 1727204719.22155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204719.22179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51243 1727204719.22203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 <<< 51243 1727204719.22216: stderr chunk (state=3): >>>debug2: match not found <<< 51243 1727204719.22230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204719.22292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204719.22355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 51243 1727204719.22384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51243 1727204719.22412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204719.22759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51243 1727204719.24691: stdout chunk (state=3): >>>ansible-tmp-1727204719.213042-51359-100948063728716=/root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716 <<< 51243 1727204719.24996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204719.25201: stderr chunk (state=3): >>><<< 51243 1727204719.25207: stdout chunk (state=3): >>><<< 51243 1727204719.25285: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204719.213042-51359-100948063728716=/root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51243 1727204719.25335: variable 'ansible_module_compression' from source: unknown 51243 1727204719.25461: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-51243vpkpdts3/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 51243 1727204719.25621: variable 'ansible_facts' from source: unknown 51243 1727204719.26010: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716/AnsiballZ_setup.py 51243 1727204719.26702: Sending initial data 51243 1727204719.26706: Sent initial data (153 bytes) 51243 1727204719.28053: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204719.28109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51243 1727204719.30799: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 51243 1727204719.30876: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 51243 1727204719.30981: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51243vpkpdts3/tmpzywyodsj /root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716/AnsiballZ_setup.py <<< 51243 1727204719.30986: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716/AnsiballZ_setup.py" <<< 51243 1727204719.31112: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-51243vpkpdts3/tmpzywyodsj" to remote "/root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716/AnsiballZ_setup.py" <<< 51243 1727204719.34295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204719.34307: stdout chunk (state=3): >>><<< 51243 1727204719.34338: stderr chunk (state=3): >>><<< 51243 1727204719.35043: done transferring module to remote 51243 1727204719.35047: _low_level_execute_command(): starting 51243 1727204719.35050: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716/ /root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716/AnsiballZ_setup.py && sleep 0' 51243 1727204719.36797: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204719.37217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 51243 1727204719.39696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204719.39755: stderr chunk (state=3): >>><<< 51243 1727204719.39771: stdout chunk (state=3): >>><<< 51243 1727204719.39797: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 51243 1727204719.39805: _low_level_execute_command(): starting 51243 1727204719.39814: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716/AnsiballZ_setup.py && sleep 0' 51243 1727204719.41182: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51243 1727204719.41288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204719.41349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 51243 1727204719.41518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204719.41602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 51243 1727204719.44204: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 51243 1727204719.44234: stdout chunk (state=3): >>>import _imp # builtin <<< 51243 1727204719.44339: stdout 
chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 51243 1727204719.44353: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 51243 1727204719.44374: stdout chunk (state=3): >>>import 'posix' # <<< 51243 1727204719.44401: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 51243 1727204719.44434: stdout chunk (state=3): >>>import 'time' # <<< 51243 1727204719.44498: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 51243 1727204719.44504: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204719.44572: stdout chunk (state=3): >>>import '_codecs' # <<< 51243 1727204719.44600: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 51243 1727204719.44624: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdda4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdd73b30> <<< 51243 1727204719.44691: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 51243 1727204719.44695: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdda6ab0> import '_signal' # <<< 51243 1727204719.44743: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # <<< 51243 1727204719.44800: stdout chunk (state=3): 
>>>import '_stat' # import 'stat' # <<< 51243 1727204719.44958: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 51243 1727204719.44980: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 51243 1727204719.45029: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 51243 1727204719.45056: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 51243 1727204719.45148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdb551c0> <<< 51243 1727204719.45173: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204719.45184: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdb560c0> <<< 51243 1727204719.45205: stdout chunk (state=3): >>>import 'site' # <<< 51243 1727204719.45240: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 51243 1727204719.46003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdb93fb0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdba8140> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 51243 1727204719.46103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204719.46126: stdout chunk (state=3): >>>import 'itertools' # <<< 51243 1727204719.46129: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 51243 1727204719.46132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbcb9e0> <<< 51243 1727204719.46134: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 51243 1727204719.46137: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbcbf80> <<< 51243 1727204719.46139: stdout chunk (state=3): >>>import '_collections' # <<< 51243 1727204719.46490: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbabc80> <<< 51243 1727204719.46497: stdout chunk (state=3): >>>import '_functools' # <<< 51243 1727204719.46499: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdba93a0> <<< 51243 1727204719.46602: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdb91160> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbef920> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbee540> <<< 51243 1727204719.46607: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # 
code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbaa390> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbecd70> <<< 51243 1727204719.46648: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1c980> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdb903e0> <<< 51243 1727204719.46701: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 51243 1727204719.46709: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204719.46712: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fdc1ce30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1cce0> <<< 51243 1727204719.46759: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204719.46762: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fdc1d0a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdb8ef00> <<< 51243 
1727204719.46787: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 51243 1727204719.47082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1d760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1d430> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1e660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 51243 1727204719.47092: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 51243 1727204719.47095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 51243 1727204719.47097: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc38890> <<< 51243 1727204719.47173: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fdc39fd0> <<< 51243 1727204719.47177: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 51243 1727204719.47241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc3ae70> <<< 51243 1727204719.47245: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fdc3b4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc3a3c0> <<< 51243 1727204719.47480: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 51243 1727204719.47498: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fdc3bef0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc3b620> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1e6c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from 
'/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 51243 1727204719.47517: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd933d70> <<< 51243 1727204719.47544: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 51243 1727204719.47578: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204719.47607: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd95c830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd95c590> <<< 51243 1727204719.47652: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd95c860> <<< 51243 1727204719.47684: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from 
'/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd95ca10> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd931f10> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 51243 1727204719.47990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd95e0c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd95cd40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1edb0> <<< 51243 1727204719.47996: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 51243 1727204719.47999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204719.48022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 51243 1727204719.48056: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 51243 1727204719.48088: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd986480> <<< 51243 1727204719.48189: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 51243 1727204719.48204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 51243 1727204719.48257: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd9a2600> <<< 51243 1727204719.48277: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 51243 1727204719.48326: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 51243 1727204719.48386: stdout chunk (state=3): >>>import 'ntpath' # <<< 51243 1727204719.48413: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd9d7380> <<< 51243 1727204719.48572: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 51243 1727204719.48580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 51243 1727204719.48809: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd9fdb20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd9d74a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd9a3290> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc 
matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd818470> <<< 51243 1727204719.48826: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd9a1640> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd95f020> <<< 51243 1727204719.49113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f24fd9a1a00> <<< 51243 1727204719.49211: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_lcvub_r_/ansible_setup_payload.zip' # zipimport: zlib available <<< 51243 1727204719.49371: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.49392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 51243 1727204719.49441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 51243 1727204719.49456: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 51243 1727204719.49668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd886270> import '_typing' # <<< 51243 1727204719.49788: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f24fd85d160> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd85c2c0> <<< 51243 1727204719.49806: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.49831: stdout chunk (state=3): >>>import 'ansible' # <<< 51243 1727204719.49887: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 51243 1727204719.49982: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.51540: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.52895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd85f680> <<< 51243 1727204719.52942: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204719.53058: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd8b5c70> import 'json.scanner' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f24fd8b5a00> <<< 51243 1727204719.53089: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd8b5310> <<< 51243 1727204719.53113: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 51243 1727204719.53131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 51243 1727204719.53169: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd8b5a90> <<< 51243 1727204719.53271: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd886f00> import 'atexit' # <<< 51243 1727204719.53285: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd8b69c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd8b6c00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 51243 1727204719.53334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 51243 1727204719.53354: stdout chunk (state=3): >>>import '_locale' # <<< 51243 1727204719.53394: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd8b70b0> <<< 51243 1727204719.53485: 
stdout chunk (state=3): >>>import 'pwd' # <<< 51243 1727204719.53488: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 51243 1727204719.53520: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd71ce00> <<< 51243 1727204719.53533: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd71ea20> <<< 51243 1727204719.53572: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 51243 1727204719.53575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 51243 1727204719.53790: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd71f3b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd720590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 51243 1727204719.53834: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f24fd723020> <<< 51243 1727204719.53874: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd723140> <<< 51243 1727204719.53902: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd7212e0> <<< 51243 1727204719.53919: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 51243 1727204719.53960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 51243 1727204719.53980: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 51243 1727204719.54005: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 51243 1727204719.54089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 51243 1727204719.54093: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 51243 1727204719.54096: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd726ea0> <<< 51243 1727204719.54098: stdout chunk (state=3): >>>import '_tokenize' # <<< 51243 1727204719.54191: stdout chunk (state=3): >>>import 'tokenize' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f24fd725970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd7256d0> <<< 51243 1727204719.54195: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 51243 1727204719.54216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 51243 1727204719.54290: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd727e90> <<< 51243 1727204719.54431: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd7217f0> <<< 51243 1727204719.54462: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd76af60> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd76b050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 51243 1727204719.54512: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd770c50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd770a10> <<< 51243 1727204719.54526: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 51243 1727204719.54671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 51243 1727204719.54725: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd773140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd7712e0> <<< 51243 1727204719.54749: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 51243 1727204719.54803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204719.54841: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 51243 1727204719.54866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 51243 1727204719.54981: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd77a900> <<< 51243 1727204719.55044: stdout chunk (state=3): >>>import 'logging' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f24fd773290> <<< 51243 1727204719.55135: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd77b6e0> <<< 51243 1727204719.55159: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204719.55204: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd77ba10> <<< 51243 1727204719.55332: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd77bbc0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd76b350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 51243 1727204719.55335: stdout chunk (state=3): >>># extension module 
'_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204719.55379: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd77f350> <<< 51243 1727204719.55547: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204719.55564: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd780830> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd77daf0> <<< 51243 1727204719.55602: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd77eea0> <<< 51243 1727204719.55671: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd77d790> # zipimport: zlib available <<< 51243 1727204719.55751: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 51243 1727204719.55771: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.55876: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.55914: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 51243 1727204719.55921: stdout chunk 
(state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 51243 1727204719.55939: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.56074: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.56270: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.57098: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.57502: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 51243 1727204719.57529: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 51243 1727204719.57553: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204719.57613: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204719.57638: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd608950> <<< 51243 1727204719.57702: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 51243 1727204719.57736: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd6096d0> <<< 51243 1727204719.57855: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd77e630> <<< 51243 1727204719.57876: 
stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 51243 1727204719.58026: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.58203: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 51243 1727204719.58224: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd609730> # zipimport: zlib available <<< 51243 1727204719.58788: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.59297: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.59413: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.59451: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 51243 1727204719.59474: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.59498: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.59556: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 51243 1727204719.59590: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.59632: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.59817: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 51243 1727204719.59839: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204719.59862: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 51243 1727204719.60127: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 
1727204719.60394: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 51243 1727204719.60478: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 51243 1727204719.60491: stdout chunk (state=3): >>>import '_ast' # <<< 51243 1727204719.60564: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd60a720> <<< 51243 1727204719.60584: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.60693: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.60739: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 51243 1727204719.60793: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 51243 1727204719.60805: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 51243 1727204719.60914: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204719.61016: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd612420> <<< 51243 1727204719.61074: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f24fd612d80> <<< 51243 1727204719.61150: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd60b740> # zipimport: zlib available <<< 51243 1727204719.61162: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.61197: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 51243 1727204719.61243: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.61264: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.61343: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.61363: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.61465: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 51243 1727204719.61488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204719.61592: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd611b20> <<< 51243 1727204719.61658: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd612ff0> <<< 51243 1727204719.61714: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 51243 1727204719.62094: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.62098: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204719.62111: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 51243 1727204719.62149: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd6a7170> <<< 51243 1727204719.62194: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd61ffe0> <<< 51243 1727204719.62287: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd61de80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd61aea0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 51243 1727204719.62307: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.62331: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.62353: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 51243 1727204719.62440: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 51243 1727204719.62464: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 51243 1727204719.62481: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.62548: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.62617: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.62654: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.62657: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.62705: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.62756: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.62791: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.62840: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 51243 1727204719.62843: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.62979: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.63014: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.63033: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.63172: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 51243 1727204719.63287: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.63486: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.63542: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.63629: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 51243 
1727204719.63653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 51243 1727204719.63745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd6a9e20> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 51243 1727204719.63776: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 51243 1727204719.63780: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 51243 1727204719.63863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 51243 1727204719.63875: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcb445f0> <<< 51243 1727204719.63967: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204719.63971: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fcb449e0> <<< 51243 1727204719.63991: stdout chunk (state=3): >>>import 'pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f24fd68d640> <<< 51243 1727204719.64010: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd68ca10> <<< 51243 1727204719.64146: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd6a8500> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd6a8890> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 51243 1727204719.64161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 51243 1727204719.64192: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 51243 1727204719.64219: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 51243 1727204719.64271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 51243 1727204719.64298: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fcb478f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcb471d0> <<< 51243 1727204719.64406: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fcb47380> <<< 51243 1727204719.64409: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcb46600> <<< 51243 1727204719.64411: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 51243 1727204719.64498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 51243 1727204719.64505: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcb479e0> <<< 51243 1727204719.64520: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 51243 1727204719.64726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fcbae4e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcbac500> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd6a95e0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204719.64729: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 51243 
1727204719.64733: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.64800: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.64868: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 51243 1727204719.64888: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.64940: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.64990: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 51243 1727204719.65026: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 51243 1727204719.65067: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 51243 1727204719.65088: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.65134: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 51243 1727204719.65168: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.65222: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 51243 1727204719.65245: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.65282: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.65385: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 51243 1727204719.65455: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.65469: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.65536: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.65581: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 51243 1727204719.65618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 51243 1727204719.66240: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.66758: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204719.66802: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.66852: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.66877: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 51243 1727204719.66972: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 51243 1727204719.67209: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204719.67230: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 51243 1727204719.67244: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.67331: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.67529: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcbafec0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 51243 1727204719.67545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 51243 1727204719.67662: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f24fcbaf470> import 'ansible.module_utils.facts.system.local' # <<< 51243 1727204719.67680: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.67749: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.67922: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 51243 1727204719.67946: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.68050: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 51243 1727204719.68158: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.68191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 51243 1727204719.68205: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.68302: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.68305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 51243 1727204719.68391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 51243 1727204719.68484: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204719.68525: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fcbe28a0> <<< 51243 1727204719.68904: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcbca630> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # <<< 51243 1727204719.68909: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 51243 1727204719.68986: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.69081: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.69207: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.69587: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 51243 1727204719.69675: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fcbfe150> <<< 51243 1727204719.69688: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcbfdd00> import 'ansible.module_utils.facts.system.user' # <<< 51243 1727204719.69691: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.69832: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.69838: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 51243 1727204719.69840: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.69925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 51243 1727204719.70030: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.70229: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 51243 1727204719.70276: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.70500: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # <<< 51243 1727204719.70507: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 51243 1727204719.70509: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.70584: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 51243 1727204719.70886: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 51243 1727204719.71020: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.71408: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 51243 1727204719.71412: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204719.71899: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.72502: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 51243 1727204719.72517: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.72621: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.72740: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 51243 1727204719.72754: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.72851: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.72958: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 51243 
1727204719.72973: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.73171: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.73319: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 51243 1727204719.73378: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 51243 1727204719.73483: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 51243 1727204719.73578: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.73703: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.73925: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.74265: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 51243 1727204719.74272: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 51243 1727204719.74288: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.74316: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 51243 1727204719.74319: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.74411: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.74486: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 51243 1727204719.74588: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.74603: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204719.74658: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 51243 1727204719.74676: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.74794: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 51243 1727204719.75112: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.75410: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 51243 1727204719.75781: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204719.75835: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 51243 1727204719.75925: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 51243 1727204719.76057: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.76203: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 51243 1727204719.76207: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.76245: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 51243 1727204719.76295: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.76389: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204719.76422: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.76510: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.76593: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.76691: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.76838: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 51243 1727204719.76930: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.76979: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 51243 1727204719.77013: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.77437: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.77877: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 51243 1727204719.77881: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.77943: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 51243 1727204719.78013: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.78084: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 51243 1727204719.78307: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.78380: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 51243 1727204719.78472: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204719.78597: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 51243 1727204719.78634: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 51243 1727204719.78677: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 51243 1727204719.79096: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 51243 1727204719.79211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fca26d50> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fca25af0> <<< 51243 1727204719.79330: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fca23170> <<< 51243 1727204719.80430: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "05", "second": "19", "epoch": "1727204719", "epoch_int": "1727204719", "date": "2024-09-24", "time": "15:05:19", "iso8601_micro": "2024-09-24T19:05:19.792861Z", "iso8601": "2024-09-24T19:05:19Z", "iso8601_basic": "20240924T150519792861", "iso8601_basic_short": "20240924T150519", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": 
"", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 51243 1727204719.81458: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing 
keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # 
cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder #<<< 51243 1727204719.81513: stdout chunk (state=3): >>> cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # 
cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy 
ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing 
ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 51243 1727204719.81956: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 51243 1727204719.82029: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 51243 1727204719.82049: stdout chunk (state=3): >>># 
destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 51243 1727204719.82062: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 51243 1727204719.82088: stdout chunk (state=3): >>># destroy ntpath <<< 51243 1727204719.82132: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 51243 1727204719.82150: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 51243 1727204719.82177: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale <<< 51243 1727204719.82197: stdout chunk (state=3): >>># destroy locale # destroy select <<< 51243 1727204719.82210: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 51243 1727204719.82288: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 51243 1727204719.82381: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 51243 1727204719.82410: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 51243 1727204719.82447: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 51243 1727204719.82450: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 51243 1727204719.82472: stdout chunk (state=3): 
>>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess <<< 51243 1727204719.82496: stdout chunk (state=3): >>># destroy base64 # destroy _ssl <<< 51243 1727204719.82545: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 51243 1727204719.82580: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct <<< 51243 1727204719.82592: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 51243 1727204719.82663: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 51243 1727204719.82694: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 51243 1727204719.82738: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 51243 1727204719.82769: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 51243 1727204719.82842: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools <<< 51243 1727204719.82886: stdout chunk (state=3): >>># cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 51243 1727204719.82900: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 51243 1727204719.83221: stdout chunk (state=3): >>># destroy sys.monitoring <<< 51243 1727204719.83264: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 51243 1727204719.83298: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 51243 1727204719.83346: stdout chunk 
(state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 51243 1727204719.83395: stdout chunk (state=3): >>># destroy _typing <<< 51243 1727204719.83415: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 51243 1727204719.83446: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 51243 1727204719.83498: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 51243 1727204719.83600: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 51243 1727204719.83653: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 51243 1727204719.83777: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 51243 1727204719.84469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 51243 1727204719.84474: stdout chunk (state=3): >>><<< 51243 1727204719.84477: stderr chunk (state=3): >>><<< 51243 1727204719.84886: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdda4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdd73b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdda6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdb551c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdb560c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdb93fb0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdba8140> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbcb9e0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbcbf80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbabc80> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdba93a0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdb91160> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbef920> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbee540> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbaa390> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdbecd70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1c980> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdb903e0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fdc1ce30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1cce0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fdc1d0a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdb8ef00> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1d760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1d430> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1e660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc38890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fdc39fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc3ae70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fdc3b4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc3a3c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fdc3bef0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc3b620> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1e6c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd933d70> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd95c830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd95c590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd95c860> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd95ca10> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd931f10> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd95e0c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd95cd40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fdc1edb0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd986480> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd9a2600> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd9d7380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd9fdb20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd9d74a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd9a3290> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd818470> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd9a1640> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd95f020> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f24fd9a1a00> # zipimport: found 103 names in '/tmp/ansible_setup_payload_lcvub_r_/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd886270> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd85d160> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd85c2c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd85f680> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd8b5c70> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd8b5a00> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd8b5310> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd8b5a90> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd886f00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd8b69c0> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd8b6c00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd8b70b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd71ce00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd71ea20> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd71f3b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd720590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd723020> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd723140> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd7212e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd726ea0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd725970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd7256d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd727e90> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f24fd7217f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd76af60> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd76b050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd770c50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd770a10> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd773140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd7712e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd77a900> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd773290> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd77b6e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd77ba10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd77bbc0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd76b350> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd77f350> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd780830> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd77daf0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd77eea0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd77d790> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd608950> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd6096d0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd77e630> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd609730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd60a720> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd612420> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd612d80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd60b740> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fd611b20> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd612ff0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd6a7170> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd61ffe0> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f24fd61de80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd61aea0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd6a9e20> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcb445f0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fcb449e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd68d640> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd68ca10> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd6a8500> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd6a8890> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fcb478f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcb471d0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fcb47380> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcb46600> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcb479e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fcbae4e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcbac500> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fd6a95e0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcbafec0> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcbaf470> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fcbe28a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcbca630> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fcbfe150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fcbfdd00> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24fca26d50> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fca25af0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24fca23170> {"ansible_facts": 
{"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "05", "second": "19", "epoch": "1727204719", "epoch_int": "1727204719", "date": "2024-09-24", "time": "15:05:19", "iso8601_micro": "2024-09-24T19:05:19.792861Z", "iso8601": "2024-09-24T19:05:19Z", "iso8601_basic": "20240924T150519792861", "iso8601_basic_short": "20240924T150519", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, 
"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", 
"PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # 
cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy 
_weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] 
removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing 
ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # 
cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing 
ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy 
ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy 
ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc 
# cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # 
destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
[WARNING]: Module invocation had junk after the JSON data:
# cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 51243 1727204719.86895: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51243 1727204719.86899: _low_level_execute_command(): starting 51243 1727204719.86901: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204719.213042-51359-100948063728716/ > /dev/null 2>&1 && sleep 0' 51243 1727204719.86972: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51243 1727204719.87068: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204719.87072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204719.87132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 51243 1727204719.87161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51243 1727204719.87185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204719.87286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 51243 1727204719.90181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204719.90186: stdout chunk (state=3): >>><<< 51243 1727204719.90189: stderr chunk (state=3): >>><<< 51243 1727204719.90250: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 51243 1727204719.90254: handler run complete 51243 1727204719.90300: variable 'ansible_facts' from source: unknown 51243 1727204719.90396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204719.90554: variable 'ansible_facts' from source: unknown 51243 1727204719.90718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204719.90721: attempt loop complete, returning result 51243 1727204719.90724: _execute() done 51243 1727204719.90726: dumping result to json 51243 1727204719.90728: done dumping result, returning 51243 1727204719.90730: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [127b8e07-fff9-5c5d-847b-000000000158] 51243 1727204719.90748: sending task 
result for task 127b8e07-fff9-5c5d-847b-000000000158 ok: [managed-node3] 51243 1727204719.91339: no more pending results, returning what we have 51243 1727204719.91342: results queue empty 51243 1727204719.91343: checking for any_errors_fatal 51243 1727204719.91345: done checking for any_errors_fatal 51243 1727204719.91345: checking for max_fail_percentage 51243 1727204719.91347: done checking for max_fail_percentage 51243 1727204719.91348: checking to see if all hosts have failed and the running result is not ok 51243 1727204719.91349: done checking to see if all hosts have failed 51243 1727204719.91350: getting the remaining hosts for this loop 51243 1727204719.91351: done getting the remaining hosts for this loop 51243 1727204719.91355: getting the next task for host managed-node3 51243 1727204719.91365: done getting next task for host managed-node3 51243 1727204719.91472: ^ task is: TASK: Check if system is ostree 51243 1727204719.91475: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204719.91478: getting variables 51243 1727204719.91480: in VariableManager get_vars() 51243 1727204719.91513: Calling all_inventory to load vars for managed-node3 51243 1727204719.91516: Calling groups_inventory to load vars for managed-node3 51243 1727204719.91520: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204719.91528: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000158 51243 1727204719.91531: WORKER PROCESS EXITING 51243 1727204719.91544: Calling all_plugins_play to load vars for managed-node3 51243 1727204719.91547: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204719.91551: Calling groups_plugins_play to load vars for managed-node3 51243 1727204719.91857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204719.92135: done with get_vars() 51243 1727204719.92157: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:05:19 -0400 (0:00:00.796) 0:00:02.530 ***** 51243 1727204719.92269: entering _queue_task() for managed-node3/stat 51243 1727204719.92747: worker is 1 (out of 1 available) 51243 1727204719.92759: exiting _queue_task() for managed-node3/stat 51243 1727204719.92772: done queuing things up, now waiting for results queue to drain 51243 1727204719.92774: waiting for pending results... 
51243 1727204719.93005: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 51243 1727204719.93125: in run() - task 127b8e07-fff9-5c5d-847b-00000000015a 51243 1727204719.93221: variable 'ansible_search_path' from source: unknown 51243 1727204719.93226: variable 'ansible_search_path' from source: unknown 51243 1727204719.93229: calling self._execute() 51243 1727204719.93300: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204719.93314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204719.93347: variable 'omit' from source: magic vars 51243 1727204719.93967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51243 1727204719.94358: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51243 1727204719.94442: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51243 1727204719.94482: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51243 1727204719.94542: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51243 1727204719.94872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51243 1727204719.94876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51243 1727204719.94880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51243 1727204719.94882: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51243 1727204719.94923: Evaluated conditional (not __network_is_ostree is defined): True 51243 1727204719.94940: variable 'omit' from source: magic vars 51243 1727204719.95005: variable 'omit' from source: magic vars 51243 1727204719.95058: variable 'omit' from source: magic vars 51243 1727204719.95094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51243 1727204719.95147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51243 1727204719.95175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51243 1727204719.95197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51243 1727204719.95229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51243 1727204719.95272: variable 'inventory_hostname' from source: host vars for 'managed-node3' 51243 1727204719.95330: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204719.95341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204719.95421: Set connection var ansible_shell_type to sh 51243 1727204719.95459: Set connection var ansible_module_compression to ZIP_DEFLATED 51243 1727204719.95469: Set connection var ansible_connection to ssh 51243 1727204719.95483: Set connection var ansible_pipelining to False 51243 1727204719.95493: Set connection var ansible_shell_executable to /bin/sh 51243 1727204719.95503: Set connection var ansible_timeout to 10 51243 1727204719.95535: variable 'ansible_shell_executable' from source: unknown 51243 1727204719.95573: variable 'ansible_connection' from 
source: unknown 51243 1727204719.95577: variable 'ansible_module_compression' from source: unknown 51243 1727204719.95579: variable 'ansible_shell_type' from source: unknown 51243 1727204719.95581: variable 'ansible_shell_executable' from source: unknown 51243 1727204719.95655: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204719.95660: variable 'ansible_pipelining' from source: unknown 51243 1727204719.95662: variable 'ansible_timeout' from source: unknown 51243 1727204719.95664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204719.95810: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 51243 1727204719.95826: variable 'omit' from source: magic vars 51243 1727204719.95838: starting attempt loop 51243 1727204719.95870: running the handler 51243 1727204719.95873: _low_level_execute_command(): starting 51243 1727204719.95882: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 51243 1727204719.96893: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204719.96928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 51243 1727204719.96957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204719.97079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 51243 1727204719.99698: stdout chunk (state=3): >>>/root <<< 51243 1727204719.99991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204719.99995: stdout chunk (state=3): >>><<< 51243 1727204719.99997: stderr chunk (state=3): >>><<< 51243 1727204720.00131: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 51243 1727204720.00146: _low_level_execute_command(): starting 51243 1727204720.00149: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103 `" && echo ansible-tmp-1727204720.0002704-51395-91131859143103="` echo /root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103 `" ) && sleep 0' 51243 1727204720.00736: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 51243 1727204720.00759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204720.00891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 51243 1727204720.03891: stdout chunk (state=3): >>>ansible-tmp-1727204720.0002704-51395-91131859143103=/root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103 
<<< 51243 1727204720.04151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204720.04155: stdout chunk (state=3): >>><<< 51243 1727204720.04158: stderr chunk (state=3): >>><<< 51243 1727204720.04373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204720.0002704-51395-91131859143103=/root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 51243 1727204720.04377: variable 'ansible_module_compression' from source: unknown 51243 1727204720.04380: ANSIBALLZ: Using lock for stat 51243 1727204720.04382: ANSIBALLZ: Acquiring lock 51243 1727204720.04384: ANSIBALLZ: Lock acquired: 139884892116496 51243 1727204720.04387: ANSIBALLZ: Creating module 51243 1727204720.17154: ANSIBALLZ: Writing module into payload 51243 1727204720.17232: ANSIBALLZ: Writing 
module 51243 1727204720.17251: ANSIBALLZ: Renaming module 51243 1727204720.17273: ANSIBALLZ: Done creating module 51243 1727204720.17291: variable 'ansible_facts' from source: unknown 51243 1727204720.17342: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103/AnsiballZ_stat.py 51243 1727204720.17461: Sending initial data 51243 1727204720.17467: Sent initial data (152 bytes) 51243 1727204720.17979: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51243 1727204720.17982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204720.17985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 51243 1727204720.17987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 51243 1727204720.17989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204720.18055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 51243 1727204720.18058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204720.18134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 51243 
1727204720.20611: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 51243 1727204720.20682: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 51243 1727204720.20758: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-51243vpkpdts3/tmpciasjj5p /root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103/AnsiballZ_stat.py <<< 51243 1727204720.20767: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103/AnsiballZ_stat.py" <<< 51243 1727204720.20836: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-51243vpkpdts3/tmpciasjj5p" to remote "/root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103/AnsiballZ_stat.py" <<< 51243 1727204720.20840: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103/AnsiballZ_stat.py" <<< 51243 1727204720.21640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204720.21709: stderr chunk (state=3): >>><<< 51243 1727204720.21715: stdout chunk (state=3): >>><<< 51243 1727204720.21742: 
done transferring module to remote 51243 1727204720.21754: _low_level_execute_command(): starting 51243 1727204720.21759: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103/ /root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103/AnsiballZ_stat.py && sleep 0' 51243 1727204720.22270: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204720.22279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204720.22283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204720.22342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 51243 1727204720.22346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51243 1727204720.22352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204720.22427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 51243 1727204720.25275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204720.25335: stderr 
chunk (state=3): >>><<< 51243 1727204720.25339: stdout chunk (state=3): >>><<< 51243 1727204720.25356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 51243 1727204720.25359: _low_level_execute_command(): starting 51243 1727204720.25364: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103/AnsiballZ_stat.py && sleep 0' 51243 1727204720.25877: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 51243 1727204720.25881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204720.25884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204720.25898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 51243 1727204720.25942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 51243 1727204720.25945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 51243 1727204720.25948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204720.26044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 51243 1727204720.29659: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 51243 1727204720.29709: stdout chunk (state=3): >>>import _imp # builtin <<< 51243 1727204720.29762: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 51243 1727204720.29768: stdout chunk (state=3): >>>import '_weakref' # <<< 51243 1727204720.29873: stdout chunk (state=3): >>>import '_io' # <<< 51243 1727204720.29897: stdout chunk (state=3): >>>import 'marshal' # <<< 51243 1727204720.29954: stdout chunk (state=3): >>>import 'posix' # <<< 51243 1727204720.30212: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 51243 1727204720.30240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 51243 1727204720.30267: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78344b8530> <<< 51243 1727204720.30284: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834487b30> <<< 51243 1727204720.30327: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 51243 1727204720.30331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 51243 1727204720.30356: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78344baab0> <<< 51243 1727204720.30391: stdout chunk (state=3): >>>import '_signal' # <<< 51243 1727204720.30439: stdout chunk (state=3): >>>import '_abc' # <<< 51243 1727204720.30446: stdout chunk (state=3): >>>import 'abc' # <<< 51243 1727204720.30483: stdout chunk (state=3): >>>import 'io' # <<< 51243 1727204720.30615: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 51243 1727204720.30683: stdout chunk (state=3): >>>import '_collections_abc' # <<< 51243 1727204720.30735: stdout chunk (state=3): >>>import 'genericpath' # <<< 51243 1727204720.30747: stdout chunk (state=3): >>>import 'posixpath' # <<< 51243 1727204720.30790: stdout chunk (state=3): >>>import 'os' # <<< 51243 1727204720.30823: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 51243 1727204720.30844: stdout 
chunk (state=3): >>>Processing user site-packages <<< 51243 1727204720.30864: stdout chunk (state=3): >>>Processing global site-packages <<< 51243 1727204720.30883: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 51243 1727204720.30893: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 51243 1727204720.31004: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f783426d190> <<< 51243 1727204720.31081: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 51243 1727204720.31098: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204720.31128: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f783426e090> <<< 51243 1727204720.31172: stdout chunk (state=3): >>>import 'site' # <<< 51243 1727204720.31223: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 51243 1727204720.31636: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 51243 1727204720.31661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 51243 1727204720.31689: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 51243 1727204720.31808: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 51243 1727204720.31822: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 51243 1727204720.31863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 51243 1727204720.31890: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342abf50> <<< 51243 1727204720.31923: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 51243 1727204720.31949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 51243 1727204720.31993: stdout chunk (state=3): >>>import '_operator' # <<< 51243 1727204720.32004: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342c00e0> <<< 51243 1727204720.32039: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 51243 1727204720.32075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' 
<<< 51243 1727204720.32120: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 51243 1727204720.32303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342e3950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342e3fe0> import '_collections' # <<< 51243 1727204720.32372: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342c3c20> <<< 51243 1727204720.32391: stdout chunk (state=3): >>>import '_functools' # <<< 51243 1727204720.32449: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342c1340> <<< 51243 1727204720.32612: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342a9100> <<< 51243 1727204720.32656: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 51243 1727204720.32686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 51243 1727204720.32719: stdout chunk (state=3): >>>import '_sre' # <<< 51243 1727204720.32747: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 51243 1727204720.32794: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 51243 1727204720.32821: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 51243 1727204720.32846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 51243 1727204720.33114: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78343078c0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78343064e0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342c21e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834304d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834334950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342a8380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834334e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834334cb0> <<< 
51243 1727204720.33149: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.33174: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.33179: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834335070> <<< 51243 1727204720.33207: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342a6ea0> <<< 51243 1727204720.33249: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 51243 1727204720.33263: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204720.33296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 51243 1727204720.33350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 51243 1727204720.33407: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834335730> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834335400> import 'importlib.machinery' # <<< 51243 1727204720.33429: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 51243 1727204720.33434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 51243 1727204720.33467: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834336600> <<< 51243 
1727204720.33489: stdout chunk (state=3): >>>import 'importlib.util' # <<< 51243 1727204720.33519: stdout chunk (state=3): >>>import 'runpy' # <<< 51243 1727204720.33551: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 51243 1727204720.33614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 51243 1727204720.33641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 51243 1727204720.33670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 51243 1727204720.33673: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834354830> <<< 51243 1727204720.33704: stdout chunk (state=3): >>>import 'errno' # <<< 51243 1727204720.33908: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834355ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834356d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import 
'_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78343573b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78343562d0> <<< 51243 1727204720.33923: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 51243 1727204720.33945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 51243 1727204720.34003: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.34022: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.34025: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834357e30> <<< 51243 1727204720.34057: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834357560> <<< 51243 1727204720.34137: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834336660> <<< 51243 1727204720.34181: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 51243 1727204720.34258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 51243 1727204720.34262: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 51243 1727204720.34269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 51243 1727204720.34365: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' 
executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834113d70> <<< 51243 1727204720.34368: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 51243 1727204720.34376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 51243 1727204720.34413: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.34421: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.34435: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f783413c860> <<< 51243 1727204720.34611: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f783413c5c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f783413c890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f783413ca70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834111f10> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 51243 1727204720.34739: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 51243 1727204720.34761: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 51243 1727204720.34790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 51243 1727204720.34808: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f783413e0f0> <<< 51243 1727204720.34849: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f783413cd70> <<< 51243 1727204720.34878: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834336d50> <<< 51243 1727204720.35006: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204720.35030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 51243 1727204720.35115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 51243 1727204720.35160: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834166480> <<< 51243 1727204720.35239: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 51243 1727204720.35263: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204720.35300: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches 
/usr/lib64/python3.12/contextlib.py <<< 51243 1727204720.35338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 51243 1727204720.35421: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834182630> <<< 51243 1727204720.35452: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 51243 1727204720.35526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 51243 1727204720.35805: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78341b73e0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 51243 1727204720.35831: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 51243 1727204720.35980: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78341e1b80> <<< 51243 1727204720.36115: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78341b7500> <<< 51243 1727204720.36195: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78341832c0> <<< 51243 1727204720.36313: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 51243 1727204720.36316: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833fbc500> <<< 51243 1727204720.36319: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834181670> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f783413f020> <<< 51243 1727204720.36479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 51243 1727204720.36517: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7834181400> <<< 51243 1727204720.36660: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_duuibjj4/ansible_stat_payload.zip' <<< 51243 1727204720.36663: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.36916: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.36958: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 51243 1727204720.36982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 51243 1727204720.37053: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 51243 1727204720.37412: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78340161e0> 
import '_typing' # <<< 51243 1727204720.37563: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833fed0d0> <<< 51243 1727204720.37589: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833fec230> # zipimport: zlib available <<< 51243 1727204720.37626: stdout chunk (state=3): >>>import 'ansible' # <<< 51243 1727204720.37666: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.37690: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.37716: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.37727: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 51243 1727204720.37758: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.40379: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.42748: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 51243 1727204720.42795: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833fef5f0> <<< 51243 1727204720.42800: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 51243 1727204720.42825: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204720.42849: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 51243 1727204720.42884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 51243 1727204720.42929: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 51243 1727204720.42980: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.42984: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834041b50> <<< 51243 1727204720.43052: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78340418e0> <<< 51243 1727204720.43102: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834041220> <<< 51243 1727204720.43214: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834041640> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834016e70> import 'atexit' # <<< 51243 1727204720.43249: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.43270: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834042900> <<< 51243 1727204720.43304: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.43319: stdout chunk 
(state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834042b40> <<< 51243 1727204720.43347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 51243 1727204720.43444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 51243 1727204720.43611: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834042ff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 51243 1727204720.43672: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ea4e00> <<< 51243 1727204720.43725: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.43729: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.43770: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833ea6a20> <<< 51243 1727204720.43774: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 51243 1727204720.43809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 51243 1727204720.43883: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ea73e0> <<< 51243 1727204720.43906: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 51243 1727204720.43961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 51243 1727204720.43991: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ea85c0> <<< 51243 1727204720.44034: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 51243 1727204720.44097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 51243 1727204720.44131: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 51243 1727204720.44310: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833eab080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833eab1a0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ea9340> <<< 51243 1727204720.44342: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 51243 1727204720.44388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 51243 1727204720.44426: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 51243 
1727204720.44447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 51243 1727204720.44477: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 51243 1727204720.44518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 51243 1727204720.44568: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 51243 1727204720.44601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 51243 1727204720.44604: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833eaf050> <<< 51243 1727204720.44625: stdout chunk (state=3): >>>import '_tokenize' # <<< 51243 1727204720.44751: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833eadb20> <<< 51243 1727204720.44766: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ead880> <<< 51243 1727204720.44796: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 51243 1727204720.44819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 51243 1727204720.45122: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833eafec0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ea9850> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833ef70b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ef7230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 51243 1727204720.45210: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833efce00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833efcbf0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 51243 1727204720.45382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 51243 1727204720.45472: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.45492: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833eff260> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7833efd460> <<< 51243 1727204720.45624: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204720.45637: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 51243 1727204720.45656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 51243 1727204720.45673: stdout chunk (state=3): >>>import '_string' # <<< 51243 1727204720.45751: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f02a80> <<< 51243 1727204720.45987: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833eff410> <<< 51243 1727204720.46104: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.46118: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f03920> <<< 51243 1727204720.46311: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f03770> # extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f03d70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ef7530> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 51243 1727204720.46341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 51243 1727204720.46377: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.46434: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.46453: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f07470> <<< 51243 1727204720.46747: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.46750: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.46785: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f08620> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f05be0> <<< 51243 1727204720.46841: stdout chunk 
(state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.46863: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f06f90> <<< 51243 1727204720.46877: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f05820> <<< 51243 1727204720.46907: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.46946: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.46949: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 51243 1727204720.47455: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 51243 1727204720.47643: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.47844: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.48974: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.50195: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 51243 1727204720.50239: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 51243 1727204720.50256: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' 
<<< 51243 1727204720.50350: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.50412: stdout chunk (state=3): >>>import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f90860> <<< 51243 1727204720.50900: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f917f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f0b0e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204720.50906: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 51243 1727204720.51084: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.51378: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 51243 1727204720.51392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 51243 1727204720.51398: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f91ca0> <<< 51243 1727204720.51428: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.52362: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.53274: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.53457: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.53538: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common.collections' # <<< 51243 1727204720.53564: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.53625: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.53678: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 51243 1727204720.53708: stdout chunk (state=3): >>> # zipimport: zlib available <<< 51243 1727204720.53839: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.54126: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 51243 1727204720.54169: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 51243 1727204720.54172: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.54615: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.55076: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 51243 1727204720.55198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 51243 1727204720.55231: stdout chunk (state=3): >>>import '_ast' # <<< 51243 1727204720.55327: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f926c0> <<< 51243 1727204720.55377: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.55506: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.55641: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 51243 1727204720.55644: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 51243 1727204720.55697: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 
'ansible.module_utils.common.arg_spec' # <<< 51243 1727204720.55992: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.56029: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833d9e030> <<< 51243 1727204720.56114: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.56137: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833d9e990> <<< 51243 1727204720.56152: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f935f0> <<< 51243 1727204720.56184: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.56260: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.56331: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 51243 1727204720.56348: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.56431: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.56492: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.56593: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.56721: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 51243 1727204720.56804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204720.56973: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.56998: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 51243 1727204720.57001: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833d9d7f0> <<< 51243 1727204720.57056: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833d9eb70> <<< 51243 1727204720.57118: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 51243 1727204720.57127: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 51243 1727204720.57213: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.57258: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.57370: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.57425: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.57500: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 51243 1727204720.57503: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 51243 1727204720.57547: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 51243 
1727204720.57619: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py<<< 51243 1727204720.57622: stdout chunk (state=3): >>> <<< 51243 1727204720.57852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 51243 1727204720.58023: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833e2ecc0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833da8a10> <<< 51243 1727204720.58136: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833da6ba0> <<< 51243 1727204720.58156: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833da69f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 51243 1727204720.58180: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.58236: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.58273: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 51243 1727204720.58314: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 51243 1727204720.58700: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 51243 1727204720.58808: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.59144: stdout chunk (state=3): >>># zipimport: zlib available <<< 51243 1727204720.59349: stdout 
chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 51243 1727204720.59411: stdout chunk (state=3): >>># destroy __main__ <<< 51243 1727204720.60004: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 51243 1727204720.60052: stdout chunk (state=3): >>> # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time<<< 51243 1727204720.60097: stdout chunk (state=3): >>> # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site<<< 51243 1727204720.60126: stdout chunk (state=3): >>> 
# cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools <<< 51243 1727204720.60140: stdout chunk (state=3): >>># cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap<<< 51243 1727204720.60211: stdout chunk (state=3): >>> # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression<<< 51243 1727204720.60246: stdout chunk (state=3): >>> # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing 
urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil<<< 51243 1727204720.60271: stdout chunk (state=3): >>> # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder<<< 51243 1727204720.60314: stdout chunk (state=3): >>> # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd <<< 51243 1727204720.60317: stdout chunk (state=3): >>># cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token<<< 51243 1727204720.60355: stdout chunk (state=3): >>> # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 51243 1727204720.60359: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # 
cleanup[2] removing systemd.journal<<< 51243 1727204720.60397: stdout chunk (state=3): >>> # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common <<< 51243 1727204720.60401: stdout chunk (state=3): >>># destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian<<< 51243 1727204720.60440: stdout chunk (state=3): >>> # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy <<< 51243 1727204720.60444: stdout chunk (state=3): >>># destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast<<< 51243 1727204720.60490: stdout chunk (state=3): >>> # destroy ast # cleanup[2] 
removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2<<< 51243 1727204720.60494: stdout chunk (state=3): >>> # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file<<< 51243 1727204720.60512: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils<<< 51243 1727204720.60617: stdout chunk (state=3): >>> # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 51243 1727204720.61205: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy 
zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch<<< 51243 1727204720.61303: stdout chunk (state=3): >>> # destroy ipaddress # destroy ntpath <<< 51243 1727204720.61307: stdout chunk (state=3): >>># destroy importlib <<< 51243 1727204720.61309: stdout chunk (state=3): >>># destroy zipimport<<< 51243 1727204720.61346: stdout chunk (state=3): >>> # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon<<< 51243 1727204720.61349: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder<<< 51243 1727204720.61388: stdout chunk (state=3): >>> # destroy json.scanner # destroy _json # destroy grp<<< 51243 1727204720.61392: stdout chunk (state=3): >>> # destroy encodings # destroy _locale<<< 51243 1727204720.61428: stdout chunk (state=3): >>> # destroy pwd # destroy locale<<< 51243 1727204720.61439: stdout chunk (state=3): >>> # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog <<< 51243 1727204720.61475: stdout chunk (state=3): >>># destroy uuid # destroy selectors <<< 51243 1727204720.61499: stdout chunk (state=3): >>># destroy errno # destroy array<<< 51243 1727204720.61516: stdout chunk (state=3): >>> <<< 51243 1727204720.61549: stdout chunk (state=3): >>># destroy datetime # destroy _hashlib<<< 51243 1727204720.61563: stdout chunk (state=3): >>> # destroy _blake2 # destroy selinux<<< 51243 1727204720.61599: stdout chunk (state=3): >>> # destroy shutil # destroy distro<<< 51243 1727204720.61614: stdout chunk (state=3): >>> # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex<<< 51243 1727204720.61699: stdout chunk (state=3): >>> # destroy subprocess # cleanup[3] wiping selinux._selinux<<< 51243 1727204720.61736: stdout chunk (state=3): >>> # cleanup[3] wiping ctypes._endian <<< 51243 1727204720.61773: stdout chunk 
(state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 51243 1727204720.61776: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves<<< 51243 1727204720.61811: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal<<< 51243 1727204720.61823: stdout chunk (state=3): >>> # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime<<< 51243 1727204720.61858: stdout chunk (state=3): >>> # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize<<< 51243 1727204720.61870: stdout chunk (state=3): >>> # cleanup[3] wiping _tokenize<<< 51243 1727204720.61916: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 51243 1727204720.61920: stdout chunk (state=3): >>> # cleanup[3] wiping _sha2 # cleanup[3] wiping _random<<< 51243 1727204720.61955: stdout chunk (state=3): >>> # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 51243 1727204720.61970: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum<<< 51243 1727204720.62218: stdout chunk (state=3): >>> # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # 
cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 51243 1727204720.62328: stdout chunk (state=3): >>># destroy sys.monitoring <<< 51243 1727204720.62343: stdout chunk (state=3): >>># destroy _socket <<< 51243 1727204720.62376: stdout chunk (state=3): >>># destroy _collections<<< 51243 1727204720.62399: stdout chunk (state=3): >>> <<< 51243 1727204720.62457: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 51243 1727204720.62460: stdout chunk (state=3): >>># destroy stat<<< 51243 1727204720.62491: stdout chunk (state=3): >>> # destroy genericpath <<< 51243 1727204720.62497: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize<<< 51243 1727204720.62538: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib <<< 51243 1727204720.62548: stdout chunk (state=3): >>># destroy copyreg <<< 51243 1727204720.62591: stdout chunk (state=3): >>># destroy contextlib # destroy _typing <<< 51243 1727204720.62623: stdout chunk (state=3): >>># destroy _tokenize <<< 51243 1727204720.62649: stdout chunk (state=3): >>># destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools <<< 51243 1727204720.62667: stdout chunk (state=3): >>># destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external<<< 51243 1727204720.62718: stdout chunk (state=3): >>> # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path<<< 51243 1727204720.62731: stdout chunk (state=3): >>> # clear sys.modules <<< 51243 1727204720.62803: stdout chunk (state=3): >>># destroy _frozen_importlib <<< 51243 1727204720.62904: stdout chunk (state=3): >>># destroy codecs<<< 51243 1727204720.62932: stdout chunk (state=3): >>> # destroy encodings.aliases <<< 51243 1727204720.62962: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig <<< 51243 1727204720.62984: stdout chunk (state=3): >>># destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref<<< 51243 1727204720.63020: stdout chunk (state=3): >>> # destroy collections # destroy threading # destroy atexit # destroy _warnings<<< 51243 1727204720.63024: stdout chunk (state=3): >>> # destroy math # destroy _bisect # destroy time<<< 51243 1727204720.63083: stdout chunk (state=3): >>> # destroy _random <<< 51243 1727204720.63086: stdout chunk (state=3): >>># destroy _weakref <<< 51243 1727204720.63126: stdout chunk (state=3): >>># destroy _operator # destroy _sha2<<< 51243 1727204720.63161: stdout chunk (state=3): >>> # destroy _string # destroy re # destroy itertools<<< 51243 1727204720.63187: stdout chunk (state=3): >>> <<< 51243 1727204720.63202: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools<<< 51243 1727204720.63231: stdout chunk (state=3): >>> # destroy builtins # 
destroy _thread # clear sys.audit hooks <<< 51243 1727204720.63904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 51243 1727204720.64010: stderr chunk (state=3): >>><<< 51243 1727204720.64013: stdout chunk (state=3): >>><<< 51243 1727204720.64185: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78344b8530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834487b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78344baab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: 
'/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f783426d190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f783426e090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342abf50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342c00e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342e3950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f78342e3fe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342c3c20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342c1340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342a9100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78343078c0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78343064e0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342c21e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834304d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834334950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342a8380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834334e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834334cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834335070> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78342a6ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834335730> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834335400> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834336600> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834354830> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834355ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834356d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78343573b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78343562d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834357e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834357560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834336660> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834113d70> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f783413c860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f783413c5c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f783413c890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f783413ca70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834111f10> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f783413e0f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f783413cd70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834336d50> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834166480> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834182630> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78341b73e0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78341e1b80> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78341b7500> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78341832c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833fbc500> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834181670> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f783413f020> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7834181400> # zipimport: found 30 names in '/tmp/ansible_stat_payload_duuibjj4/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78340161e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833fed0d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833fec230> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833fef5f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834041b50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78340418e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834041220> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834041640> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834016e70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834042900> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7834042b40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7834042ff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ea4e00> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833ea6a20> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ea73e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ea85c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833eab080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833eab1a0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ea9340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833eaf050> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833eadb20> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ead880> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833eafec0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ea9850> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833ef70b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ef7230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833efce00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833efcbf0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833eff260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833efd460> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f02a80> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833eff410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f03920> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f03770> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f03d70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833ef7530> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f07470> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f08620> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f05be0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f06f90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f05820> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833f90860> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f917f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f0b0e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f91ca0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f926c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833d9e030> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833d9e990> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833f935f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7833d9d7f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833d9eb70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833e2ecc0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833da8a10> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833da6ba0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7833da69f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # 
clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # 
cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # 
cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy 
copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] 
removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping 
collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy 
copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. [WARNING]: Module invocation had junk after the JSON data: [duplicate interpreter shutdown trace elided; identical to the "# destroy __main__ … # clear sys.audit hooks" trace above] 51243 1727204720.64932: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, 
'_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 51243 1727204720.64935: _low_level_execute_command(): starting 51243 1727204720.64938: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204720.0002704-51395-91131859143103/ > /dev/null 2>&1 && sleep 0' 51243 1727204720.65343: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 51243 1727204720.65365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 51243 1727204720.65386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204720.65404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51243 1727204720.65443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 51243 1727204720.65461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 51243 1727204720.65546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 51243 1727204720.65570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 51243 1727204720.65685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 51243 1727204720.68597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 51243 1727204720.68703: stderr chunk (state=3): >>><<< 51243 1727204720.68707: stdout chunk (state=3): >>><<< 51243 1727204720.68872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 51243 1727204720.68877: handler run complete 51243 1727204720.68880: attempt loop complete, returning result 51243 
1727204720.68882: _execute() done 51243 1727204720.68885: dumping result to json 51243 1727204720.68887: done dumping result, returning 51243 1727204720.68889: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree [127b8e07-fff9-5c5d-847b-00000000015a] 51243 1727204720.68892: sending task result for task 127b8e07-fff9-5c5d-847b-00000000015a ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 51243 1727204720.69063: no more pending results, returning what we have 51243 1727204720.69068: results queue empty 51243 1727204720.69069: checking for any_errors_fatal 51243 1727204720.69285: done checking for any_errors_fatal 51243 1727204720.69287: checking for max_fail_percentage 51243 1727204720.69289: done checking for max_fail_percentage 51243 1727204720.69290: checking to see if all hosts have failed and the running result is not ok 51243 1727204720.69291: done checking to see if all hosts have failed 51243 1727204720.69292: getting the remaining hosts for this loop 51243 1727204720.69294: done getting the remaining hosts for this loop 51243 1727204720.69299: getting the next task for host managed-node3 51243 1727204720.69306: done getting next task for host managed-node3 51243 1727204720.69309: ^ task is: TASK: Set flag to indicate system is ostree 51243 1727204720.69311: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204720.69315: getting variables 51243 1727204720.69317: in VariableManager get_vars() 51243 1727204720.69354: Calling all_inventory to load vars for managed-node3 51243 1727204720.69357: Calling groups_inventory to load vars for managed-node3 51243 1727204720.69362: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204720.69388: Calling all_plugins_play to load vars for managed-node3 51243 1727204720.69392: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204720.69396: Calling groups_plugins_play to load vars for managed-node3 51243 1727204720.69754: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000015a 51243 1727204720.69759: WORKER PROCESS EXITING 51243 1727204720.69789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204720.70215: done with get_vars() 51243 1727204720.70229: done getting variables 51243 1727204720.70354: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.781) 0:00:03.311 ***** 51243 1727204720.70394: entering _queue_task() for managed-node3/set_fact 51243 1727204720.70400: Creating lock for set_fact 51243 1727204720.70804: worker is 1 (out of 1 available) 51243 1727204720.70972: exiting _queue_task() for managed-node3/set_fact 51243 1727204720.70984: done queuing things up, now waiting for results queue to drain 51243 1727204720.70986: waiting for pending results... 
51243 1727204720.71141: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 51243 1727204720.71272: in run() - task 127b8e07-fff9-5c5d-847b-00000000015b 51243 1727204720.71295: variable 'ansible_search_path' from source: unknown 51243 1727204720.71310: variable 'ansible_search_path' from source: unknown 51243 1727204720.71362: calling self._execute() 51243 1727204720.71502: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204720.71514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204720.71538: variable 'omit' from source: magic vars 51243 1727204720.72159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 51243 1727204720.72497: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 51243 1727204720.72572: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 51243 1727204720.72627: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 51243 1727204720.72680: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 51243 1727204720.72791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 51243 1727204720.72826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 51243 1727204720.72875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 51243 1727204720.72905: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 51243 1727204720.73064: Evaluated conditional (not __network_is_ostree is defined): True 51243 1727204720.73082: variable 'omit' from source: magic vars 51243 1727204720.73129: variable 'omit' from source: magic vars 51243 1727204720.73292: variable '__ostree_booted_stat' from source: set_fact 51243 1727204720.73359: variable 'omit' from source: magic vars 51243 1727204720.73402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51243 1727204720.73444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51243 1727204720.73492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51243 1727204720.73502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51243 1727204720.73523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51243 1727204720.73599: variable 'inventory_hostname' from source: host vars for 'managed-node3' 51243 1727204720.73603: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204720.73611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204720.73750: Set connection var ansible_shell_type to sh 51243 1727204720.73825: Set connection var ansible_module_compression to ZIP_DEFLATED 51243 1727204720.73828: Set connection var ansible_connection to ssh 51243 1727204720.73830: Set connection var ansible_pipelining to False 51243 1727204720.73835: Set connection var ansible_shell_executable to /bin/sh 51243 1727204720.73837: Set connection var ansible_timeout to 10 51243 1727204720.73839: variable 'ansible_shell_executable' 
from source: unknown 51243 1727204720.73850: variable 'ansible_connection' from source: unknown 51243 1727204720.73857: variable 'ansible_module_compression' from source: unknown 51243 1727204720.73863: variable 'ansible_shell_type' from source: unknown 51243 1727204720.73871: variable 'ansible_shell_executable' from source: unknown 51243 1727204720.73891: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204720.73918: variable 'ansible_pipelining' from source: unknown 51243 1727204720.73932: variable 'ansible_timeout' from source: unknown 51243 1727204720.73956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204720.74154: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51243 1727204720.74161: variable 'omit' from source: magic vars 51243 1727204720.74262: starting attempt loop 51243 1727204720.74268: running the handler 51243 1727204720.74272: handler run complete 51243 1727204720.74275: attempt loop complete, returning result 51243 1727204720.74278: _execute() done 51243 1727204720.74281: dumping result to json 51243 1727204720.74283: done dumping result, returning 51243 1727204720.74286: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [127b8e07-fff9-5c5d-847b-00000000015b] 51243 1727204720.74288: sending task result for task 127b8e07-fff9-5c5d-847b-00000000015b 51243 1727204720.74361: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000015b ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 51243 1727204720.74531: no more pending results, returning what we have 51243 1727204720.74537: results queue empty 51243 1727204720.74538: checking for 
any_errors_fatal 51243 1727204720.74547: done checking for any_errors_fatal 51243 1727204720.74547: checking for max_fail_percentage 51243 1727204720.74549: done checking for max_fail_percentage 51243 1727204720.74550: checking to see if all hosts have failed and the running result is not ok 51243 1727204720.74551: done checking to see if all hosts have failed 51243 1727204720.74552: getting the remaining hosts for this loop 51243 1727204720.74554: done getting the remaining hosts for this loop 51243 1727204720.74558: getting the next task for host managed-node3 51243 1727204720.74572: done getting next task for host managed-node3 51243 1727204720.74575: ^ task is: TASK: Fix CentOS6 Base repo 51243 1727204720.74578: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204720.74581: getting variables 51243 1727204720.74583: in VariableManager get_vars() 51243 1727204720.74617: Calling all_inventory to load vars for managed-node3 51243 1727204720.74620: Calling groups_inventory to load vars for managed-node3 51243 1727204720.74624: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204720.74638: Calling all_plugins_play to load vars for managed-node3 51243 1727204720.74641: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204720.74645: Calling groups_plugins_play to load vars for managed-node3 51243 1727204720.75111: WORKER PROCESS EXITING 51243 1727204720.75143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204720.75460: done with get_vars() 51243 1727204720.75478: done getting variables 51243 1727204720.75626: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.052) 0:00:03.364 ***** 51243 1727204720.75669: entering _queue_task() for managed-node3/copy 51243 1727204720.76136: worker is 1 (out of 1 available) 51243 1727204720.76150: exiting _queue_task() for managed-node3/copy 51243 1727204720.76162: done queuing things up, now waiting for results queue to drain 51243 1727204720.76163: waiting for pending results... 
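[Editor's note: hedged reconstruction, not part of the log. The "Fix CentOS6 Base repo" task queued here is a `copy` action (per the ActionModule load above) that gets skipped because `ansible_distribution == 'CentOS'` is False on this Fedora node. A task consistent with that trace would look roughly like the following; the `dest` and `content` are placeholders, as the log excerpt does not show them:]

```yaml
# Hypothetical sketch; only the module (copy) and the first when-clause
# term are visible in the log. Conditions short-circuit, so the real task
# may carry additional conditions (e.g. on the major version).
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # placeholder path
    content: "{{ __fixed_repo_content }}"      # placeholder variable
  when: ansible_distribution == 'CentOS'
```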
51243 1727204720.76373: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 51243 1727204720.76529: in run() - task 127b8e07-fff9-5c5d-847b-00000000015d 51243 1727204720.76642: variable 'ansible_search_path' from source: unknown 51243 1727204720.76646: variable 'ansible_search_path' from source: unknown 51243 1727204720.76649: calling self._execute() 51243 1727204720.76723: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204720.76768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204720.76773: variable 'omit' from source: magic vars 51243 1727204720.77486: variable 'ansible_distribution' from source: facts 51243 1727204720.77619: Evaluated conditional (ansible_distribution == 'CentOS'): False 51243 1727204720.77623: when evaluation is False, skipping this task 51243 1727204720.77627: _execute() done 51243 1727204720.77636: dumping result to json 51243 1727204720.77639: done dumping result, returning 51243 1727204720.77647: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [127b8e07-fff9-5c5d-847b-00000000015d] 51243 1727204720.77650: sending task result for task 127b8e07-fff9-5c5d-847b-00000000015d 51243 1727204720.77848: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000015d 51243 1727204720.77852: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 51243 1727204720.77930: no more pending results, returning what we have 51243 1727204720.77936: results queue empty 51243 1727204720.77938: checking for any_errors_fatal 51243 1727204720.77942: done checking for any_errors_fatal 51243 1727204720.77943: checking for max_fail_percentage 51243 1727204720.77945: done checking for max_fail_percentage 51243 1727204720.77946: checking to see if all hosts have failed and the running result is not ok 51243 1727204720.77946: done 
checking to see if all hosts have failed 51243 1727204720.77947: getting the remaining hosts for this loop 51243 1727204720.77949: done getting the remaining hosts for this loop 51243 1727204720.77955: getting the next task for host managed-node3 51243 1727204720.77962: done getting next task for host managed-node3 51243 1727204720.77967: ^ task is: TASK: Include the task 'enable_epel.yml' 51243 1727204720.77970: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204720.77974: getting variables 51243 1727204720.77975: in VariableManager get_vars() 51243 1727204720.78008: Calling all_inventory to load vars for managed-node3 51243 1727204720.78011: Calling groups_inventory to load vars for managed-node3 51243 1727204720.78016: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204720.78034: Calling all_plugins_play to load vars for managed-node3 51243 1727204720.78038: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204720.78041: Calling groups_plugins_play to load vars for managed-node3 51243 1727204720.79063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204720.79655: done with get_vars() 51243 1727204720.79671: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.042) 0:00:03.406 ***** 51243 1727204720.79890: entering _queue_task() for managed-node3/include_tasks 51243 1727204720.80678: worker is 1 (out of 1 available) 51243 1727204720.80692: exiting _queue_task() for managed-node3/include_tasks 51243 1727204720.80703: done queuing things up, now waiting for results queue to drain 51243 1727204720.80705: waiting for pending results... 51243 1727204720.81070: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 51243 1727204720.81163: in run() - task 127b8e07-fff9-5c5d-847b-00000000015e 51243 1727204720.81169: variable 'ansible_search_path' from source: unknown 51243 1727204720.81172: variable 'ansible_search_path' from source: unknown 51243 1727204720.81495: calling self._execute() 51243 1727204720.81499: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204720.81502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204720.81505: variable 'omit' from source: magic vars 51243 1727204720.82622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 51243 1727204720.85434: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 51243 1727204720.85523: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 51243 1727204720.85573: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 51243 1727204720.85618: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 51243 1727204720.85652: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 51243 1727204720.85748: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 51243 1727204720.85874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 51243 1727204720.85878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 51243 1727204720.85880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 51243 1727204720.85892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 51243 1727204720.86037: variable '__network_is_ostree' from source: set_fact 51243 1727204720.86062: Evaluated conditional (not __network_is_ostree | d(false)): True 51243 1727204720.86082: _execute() done 51243 1727204720.86095: dumping result to json 51243 1727204720.86102: done dumping result, returning 51243 1727204720.86119: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [127b8e07-fff9-5c5d-847b-00000000015e] 51243 1727204720.86128: sending task result for task 127b8e07-fff9-5c5d-847b-00000000015e 51243 1727204720.86363: no more pending results, returning what we have 51243 1727204720.86370: in VariableManager get_vars() 51243 1727204720.86410: Calling all_inventory to load vars for managed-node3 51243 1727204720.86413: Calling groups_inventory to load vars for managed-node3 51243 1727204720.86419: Calling all_plugins_inventory to load vars 
for managed-node3 51243 1727204720.86432: Calling all_plugins_play to load vars for managed-node3 51243 1727204720.86435: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204720.86443: Calling groups_plugins_play to load vars for managed-node3 51243 1727204720.86872: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000015e 51243 1727204720.86877: WORKER PROCESS EXITING 51243 1727204720.86905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204720.87384: done with get_vars() 51243 1727204720.87392: variable 'ansible_search_path' from source: unknown 51243 1727204720.87393: variable 'ansible_search_path' from source: unknown 51243 1727204720.87429: we have included files to process 51243 1727204720.87430: generating all_blocks data 51243 1727204720.87431: done generating all_blocks data 51243 1727204720.87435: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 51243 1727204720.87436: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 51243 1727204720.87443: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 51243 1727204720.89005: done processing included file 51243 1727204720.89008: iterating over new_blocks loaded from include file 51243 1727204720.89010: in VariableManager get_vars() 51243 1727204720.89027: done with get_vars() 51243 1727204720.89029: filtering new block on tags 51243 1727204720.89055: done filtering new block on tags 51243 1727204720.89058: in VariableManager get_vars() 51243 1727204720.89102: done with get_vars() 51243 1727204720.89104: filtering new block on tags 51243 1727204720.89116: done filtering new block on tags 51243 1727204720.89118: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 51243 1727204720.89125: extending task lists for all hosts with included blocks 51243 1727204720.89239: done extending task lists 51243 1727204720.89240: done processing included files 51243 1727204720.89241: results queue empty 51243 1727204720.89242: checking for any_errors_fatal 51243 1727204720.89246: done checking for any_errors_fatal 51243 1727204720.89246: checking for max_fail_percentage 51243 1727204720.89248: done checking for max_fail_percentage 51243 1727204720.89248: checking to see if all hosts have failed and the running result is not ok 51243 1727204720.89249: done checking to see if all hosts have failed 51243 1727204720.89250: getting the remaining hosts for this loop 51243 1727204720.89251: done getting the remaining hosts for this loop 51243 1727204720.89254: getting the next task for host managed-node3 51243 1727204720.89257: done getting next task for host managed-node3 51243 1727204720.89260: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 51243 1727204720.89263: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204720.89481: getting variables 51243 1727204720.89483: in VariableManager get_vars() 51243 1727204720.89494: Calling all_inventory to load vars for managed-node3 51243 1727204720.89497: Calling groups_inventory to load vars for managed-node3 51243 1727204720.89500: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204720.89506: Calling all_plugins_play to load vars for managed-node3 51243 1727204720.89515: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204720.89519: Calling groups_plugins_play to load vars for managed-node3 51243 1727204720.89978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204720.90438: done with get_vars() 51243 1727204720.90452: done getting variables 51243 1727204720.90653: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 51243 1727204720.91259: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.116) 0:00:03.522 ***** 51243 1727204720.91518: entering _queue_task() for managed-node3/command 51243 1727204720.91520: Creating lock for command 51243 1727204720.92305: worker is 1 (out of 1 available) 51243 1727204720.92318: exiting _queue_task() for managed-node3/command 51243 1727204720.92329: done queuing things up, now waiting for results queue to drain 51243 1727204720.92330: waiting for pending results... 
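[Editor's note: hedged reconstruction, not part of the log. The task header "Create EPEL {{ ansible_distribution_major_version }}" renders as "Create EPEL 40" on this node, and the `command` task is then skipped because `ansible_distribution in ['RedHat', 'CentOS']` is False. A shape consistent with the trace, with the actual command elided since it never appears in this excerpt:]

```yaml
# Hypothetical sketch of enable_epel.yml:8 as traced below; the command
# itself is not visible in this log and is deliberately left elided.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "{{ __epel_setup_command }}"   # placeholder; real command unknown
  when: ansible_distribution in ['RedHat', 'CentOS']
```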
51243 1727204720.92664: running TaskExecutor() for managed-node3/TASK: Create EPEL 40 51243 1727204720.92945: in run() - task 127b8e07-fff9-5c5d-847b-000000000178 51243 1727204720.92959: variable 'ansible_search_path' from source: unknown 51243 1727204720.92963: variable 'ansible_search_path' from source: unknown 51243 1727204720.93111: calling self._execute() 51243 1727204720.93210: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204720.93219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204720.93228: variable 'omit' from source: magic vars 51243 1727204720.94359: variable 'ansible_distribution' from source: facts 51243 1727204720.94427: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 51243 1727204720.94431: when evaluation is False, skipping this task 51243 1727204720.94434: _execute() done 51243 1727204720.94437: dumping result to json 51243 1727204720.94439: done dumping result, returning 51243 1727204720.94442: done running TaskExecutor() for managed-node3/TASK: Create EPEL 40 [127b8e07-fff9-5c5d-847b-000000000178] 51243 1727204720.94444: sending task result for task 127b8e07-fff9-5c5d-847b-000000000178 51243 1727204720.94535: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000178 51243 1727204720.94539: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 51243 1727204720.94607: no more pending results, returning what we have 51243 1727204720.94610: results queue empty 51243 1727204720.94611: checking for any_errors_fatal 51243 1727204720.94613: done checking for any_errors_fatal 51243 1727204720.94614: checking for max_fail_percentage 51243 1727204720.94616: done checking for max_fail_percentage 51243 1727204720.94616: checking to see if all hosts have failed and the running result is not ok 51243 
1727204720.94617: done checking to see if all hosts have failed 51243 1727204720.94618: getting the remaining hosts for this loop 51243 1727204720.94620: done getting the remaining hosts for this loop 51243 1727204720.94625: getting the next task for host managed-node3 51243 1727204720.94634: done getting next task for host managed-node3 51243 1727204720.94637: ^ task is: TASK: Install yum-utils package 51243 1727204720.94642: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204720.94646: getting variables 51243 1727204720.94648: in VariableManager get_vars() 51243 1727204720.94685: Calling all_inventory to load vars for managed-node3 51243 1727204720.94688: Calling groups_inventory to load vars for managed-node3 51243 1727204720.94692: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204720.94707: Calling all_plugins_play to load vars for managed-node3 51243 1727204720.94709: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204720.94712: Calling groups_plugins_play to load vars for managed-node3 51243 1727204720.95260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204720.95810: done with get_vars() 51243 1727204720.95825: done getting variables 51243 1727204720.95936: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.044) 0:00:03.567 ***** 51243 1727204720.96275: entering _queue_task() for managed-node3/package 51243 1727204720.96277: Creating lock for package 51243 1727204720.96840: worker is 1 (out of 1 available) 51243 1727204720.96855: exiting _queue_task() for managed-node3/package 51243 1727204720.97072: done queuing things up, now waiting for results queue to drain 51243 1727204720.97075: waiting for pending results... 
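The next block queues the `Install yum-utils package` task through the `package` action plugin (note the `Loading ActionModule 'package'` record above), and it is skipped by the same distribution check. A minimal sketch of such a task, assuming the obvious shape since the real file is not shown here:

```yaml
# Hypothetical sketch of a conditional package install.
- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils
    state: present
  when: ansible_distribution in ['RedHat', 'CentOS']
```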
51243 1727204720.97489: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 51243 1727204720.97802: in run() - task 127b8e07-fff9-5c5d-847b-000000000179 51243 1727204720.97806: variable 'ansible_search_path' from source: unknown 51243 1727204720.97809: variable 'ansible_search_path' from source: unknown 51243 1727204720.97812: calling self._execute() 51243 1727204720.97814: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204720.97817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204720.97828: variable 'omit' from source: magic vars 51243 1727204720.98272: variable 'ansible_distribution' from source: facts 51243 1727204720.98291: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 51243 1727204720.98299: when evaluation is False, skipping this task 51243 1727204720.98306: _execute() done 51243 1727204720.98314: dumping result to json 51243 1727204720.98321: done dumping result, returning 51243 1727204720.98335: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [127b8e07-fff9-5c5d-847b-000000000179] 51243 1727204720.98375: sending task result for task 127b8e07-fff9-5c5d-847b-000000000179 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 51243 1727204720.98636: no more pending results, returning what we have 51243 1727204720.98640: results queue empty 51243 1727204720.98641: checking for any_errors_fatal 51243 1727204720.98650: done checking for any_errors_fatal 51243 1727204720.98651: checking for max_fail_percentage 51243 1727204720.98653: done checking for max_fail_percentage 51243 1727204720.98654: checking to see if all hosts have failed and the running result is not ok 51243 1727204720.98655: done checking to see if all hosts have failed 51243 1727204720.98655: getting the remaining hosts for this loop 51243 
1727204720.98657: done getting the remaining hosts for this loop 51243 1727204720.98662: getting the next task for host managed-node3 51243 1727204720.98672: done getting next task for host managed-node3 51243 1727204720.98676: ^ task is: TASK: Enable EPEL 7 51243 1727204720.98680: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204720.98684: getting variables 51243 1727204720.98687: in VariableManager get_vars() 51243 1727204720.98724: Calling all_inventory to load vars for managed-node3 51243 1727204720.98728: Calling groups_inventory to load vars for managed-node3 51243 1727204720.98736: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204720.98753: Calling all_plugins_play to load vars for managed-node3 51243 1727204720.98757: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204720.98760: Calling groups_plugins_play to load vars for managed-node3 51243 1727204720.98879: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000179 51243 1727204720.98883: WORKER PROCESS EXITING 51243 1727204720.99290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204720.99598: done with get_vars() 51243 1727204720.99612: done getting variables 51243 1727204720.99699: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:05:20 -0400 (0:00:00.034) 0:00:03.605 ***** 51243 1727204720.99746: entering _queue_task() for managed-node3/command 51243 1727204721.00226: worker is 1 (out of 1 available) 51243 1727204721.00243: exiting _queue_task() for managed-node3/command 51243 1727204721.00256: done queuing things up, now waiting for results queue to drain 51243 1727204721.00257: waiting for pending results... 
51243 1727204721.00652: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 51243 1727204721.00702: in run() - task 127b8e07-fff9-5c5d-847b-00000000017a 51243 1727204721.00724: variable 'ansible_search_path' from source: unknown 51243 1727204721.00732: variable 'ansible_search_path' from source: unknown 51243 1727204721.00792: calling self._execute() 51243 1727204721.00973: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.00978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.00982: variable 'omit' from source: magic vars 51243 1727204721.01431: variable 'ansible_distribution' from source: facts 51243 1727204721.01510: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 51243 1727204721.01518: when evaluation is False, skipping this task 51243 1727204721.01522: _execute() done 51243 1727204721.01527: dumping result to json 51243 1727204721.01535: done dumping result, returning 51243 1727204721.01538: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [127b8e07-fff9-5c5d-847b-00000000017a] 51243 1727204721.01541: sending task result for task 127b8e07-fff9-5c5d-847b-00000000017a 51243 1727204721.01766: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000017a skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 51243 1727204721.01826: no more pending results, returning what we have 51243 1727204721.01838: results queue empty 51243 1727204721.01840: checking for any_errors_fatal 51243 1727204721.01853: done checking for any_errors_fatal 51243 1727204721.01854: checking for max_fail_percentage 51243 1727204721.01856: done checking for max_fail_percentage 51243 1727204721.01857: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.01858: done checking to see if all hosts have failed 
51243 1727204721.01859: getting the remaining hosts for this loop 51243 1727204721.01860: done getting the remaining hosts for this loop 51243 1727204721.01867: getting the next task for host managed-node3 51243 1727204721.01911: done getting next task for host managed-node3 51243 1727204721.01915: ^ task is: TASK: Enable EPEL 8 51243 1727204721.01919: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.01923: getting variables 51243 1727204721.01925: in VariableManager get_vars() 51243 1727204721.02084: Calling all_inventory to load vars for managed-node3 51243 1727204721.02087: Calling groups_inventory to load vars for managed-node3 51243 1727204721.02091: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.02121: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.02124: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.02128: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.02479: WORKER PROCESS EXITING 51243 1727204721.02510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.02796: done with get_vars() 51243 1727204721.02808: done getting variables 51243 1727204721.02890: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.031) 0:00:03.636 ***** 51243 1727204721.02923: entering _queue_task() for managed-node3/command 51243 1727204721.03653: worker is 1 (out of 1 available) 51243 1727204721.03799: exiting _queue_task() for managed-node3/command 51243 1727204721.03811: done queuing things up, now waiting for results queue to drain 51243 1727204721.03813: waiting for pending results... 
51243 1727204721.04135: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 51243 1727204721.04572: in run() - task 127b8e07-fff9-5c5d-847b-00000000017b 51243 1727204721.04579: variable 'ansible_search_path' from source: unknown 51243 1727204721.04583: variable 'ansible_search_path' from source: unknown 51243 1727204721.04588: calling self._execute() 51243 1727204721.04736: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.04793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.04808: variable 'omit' from source: magic vars 51243 1727204721.05774: variable 'ansible_distribution' from source: facts 51243 1727204721.05867: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 51243 1727204721.05873: when evaluation is False, skipping this task 51243 1727204721.05876: _execute() done 51243 1727204721.05878: dumping result to json 51243 1727204721.05881: done dumping result, returning 51243 1727204721.05884: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [127b8e07-fff9-5c5d-847b-00000000017b] 51243 1727204721.05886: sending task result for task 127b8e07-fff9-5c5d-847b-00000000017b 51243 1727204721.06068: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000017b 51243 1727204721.06275: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 51243 1727204721.06335: no more pending results, returning what we have 51243 1727204721.06338: results queue empty 51243 1727204721.06339: checking for any_errors_fatal 51243 1727204721.06347: done checking for any_errors_fatal 51243 1727204721.06348: checking for max_fail_percentage 51243 1727204721.06349: done checking for max_fail_percentage 51243 1727204721.06350: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.06351: 
done checking to see if all hosts have failed 51243 1727204721.06352: getting the remaining hosts for this loop 51243 1727204721.06353: done getting the remaining hosts for this loop 51243 1727204721.06358: getting the next task for host managed-node3 51243 1727204721.06375: done getting next task for host managed-node3 51243 1727204721.06378: ^ task is: TASK: Enable EPEL 6 51243 1727204721.06382: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.06386: getting variables 51243 1727204721.06387: in VariableManager get_vars() 51243 1727204721.06418: Calling all_inventory to load vars for managed-node3 51243 1727204721.06422: Calling groups_inventory to load vars for managed-node3 51243 1727204721.06426: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.06440: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.06444: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.06447: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.07086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.07371: done with get_vars() 51243 1727204721.07385: done getting variables 51243 1727204721.07452: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.045) 0:00:03.682 ***** 51243 1727204721.07494: entering _queue_task() for managed-node3/copy 51243 1727204721.07959: worker is 1 (out of 1 available) 51243 1727204721.07987: exiting _queue_task() for managed-node3/copy 51243 1727204721.08000: done queuing things up, now waiting for results queue to drain 51243 1727204721.08008: waiting for pending results... 
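Unlike the earlier EPEL tasks, `Enable EPEL 6` loads the `copy` action plugin rather than `command`, so on EL6 hosts the repository is presumably enabled by writing a file instead of running a tool. The skipped task might look roughly like this; the destination path and file content are assumptions, not taken from the log:

```yaml
# Hypothetical sketch: enabling a repo by copying a .repo file.
# The real path and content are not visible in this log.
- name: Enable EPEL 6
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/epel.repo   # assumed destination
    content: |
      [epel]
      enabled=1
  when: ansible_distribution in ['RedHat', 'CentOS']
```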
51243 1727204721.08277: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 51243 1727204721.08418: in run() - task 127b8e07-fff9-5c5d-847b-00000000017d 51243 1727204721.08442: variable 'ansible_search_path' from source: unknown 51243 1727204721.08456: variable 'ansible_search_path' from source: unknown 51243 1727204721.08507: calling self._execute() 51243 1727204721.08637: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.08653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.08685: variable 'omit' from source: magic vars 51243 1727204721.09859: variable 'ansible_distribution' from source: facts 51243 1727204721.09905: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 51243 1727204721.09970: when evaluation is False, skipping this task 51243 1727204721.09977: _execute() done 51243 1727204721.09980: dumping result to json 51243 1727204721.09983: done dumping result, returning 51243 1727204721.09985: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [127b8e07-fff9-5c5d-847b-00000000017d] 51243 1727204721.09988: sending task result for task 127b8e07-fff9-5c5d-847b-00000000017d 51243 1727204721.10146: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000017d 51243 1727204721.10149: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 51243 1727204721.10247: no more pending results, returning what we have 51243 1727204721.10251: results queue empty 51243 1727204721.10252: checking for any_errors_fatal 51243 1727204721.10258: done checking for any_errors_fatal 51243 1727204721.10258: checking for max_fail_percentage 51243 1727204721.10261: done checking for max_fail_percentage 51243 1727204721.10261: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.10262: 
done checking to see if all hosts have failed 51243 1727204721.10263: getting the remaining hosts for this loop 51243 1727204721.10267: done getting the remaining hosts for this loop 51243 1727204721.10323: getting the next task for host managed-node3 51243 1727204721.10337: done getting next task for host managed-node3 51243 1727204721.10340: ^ task is: TASK: Set network provider to 'nm' 51243 1727204721.10343: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204721.10347: getting variables 51243 1727204721.10349: in VariableManager get_vars() 51243 1727204721.10455: Calling all_inventory to load vars for managed-node3 51243 1727204721.10458: Calling groups_inventory to load vars for managed-node3 51243 1727204721.10462: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.10518: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.10522: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.10526: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.10900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.11156: done with get_vars() 51243 1727204721.11170: done getting variables 51243 1727204721.11249: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.037) 0:00:03.720 ***** 51243 1727204721.11284: entering _queue_task() for managed-node3/set_fact 51243 1727204721.11677: worker is 1 (out of 1 available) 51243 1727204721.11691: exiting _queue_task() for managed-node3/set_fact 51243 1727204721.11704: done queuing things up, now waiting for results queue to drain 51243 1727204721.11705: waiting for pending results... 51243 1727204721.12103: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' 51243 1727204721.12173: in run() - task 127b8e07-fff9-5c5d-847b-000000000007 51243 1727204721.12177: variable 'ansible_search_path' from source: unknown 51243 1727204721.12195: calling self._execute() 51243 1727204721.12292: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.12318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.12336: variable 'omit' from source: magic vars 51243 1727204721.12470: variable 'omit' from source: magic vars 51243 1727204721.12507: variable 'omit' from source: magic vars 51243 1727204721.12564: variable 'omit' from source: magic vars 51243 1727204721.12630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 51243 1727204721.12679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 51243 1727204721.12707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 51243 1727204721.12738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51243 1727204721.12757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 51243 1727204721.12798: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 51243 1727204721.12850: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.12854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.12938: Set connection var ansible_shell_type to sh 51243 1727204721.12964: Set connection var ansible_module_compression to ZIP_DEFLATED 51243 1727204721.12974: Set connection var ansible_connection to ssh 51243 1727204721.12986: Set connection var ansible_pipelining to False 51243 1727204721.12998: Set connection var ansible_shell_executable to /bin/sh 51243 1727204721.13008: Set connection var ansible_timeout to 10 51243 1727204721.13067: variable 'ansible_shell_executable' from source: unknown 51243 1727204721.13072: variable 'ansible_connection' from source: unknown 51243 1727204721.13075: variable 'ansible_module_compression' from source: unknown 51243 1727204721.13077: variable 'ansible_shell_type' from source: unknown 51243 1727204721.13079: variable 'ansible_shell_executable' from source: unknown 51243 1727204721.13081: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.13083: variable 'ansible_pipelining' from source: unknown 51243 1727204721.13085: variable 'ansible_timeout' from source: unknown 51243 1727204721.13087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.13262: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 51243 1727204721.13322: variable 'omit' from source: magic vars 51243 1727204721.13325: starting attempt loop 51243 1727204721.13328: running the handler 51243 1727204721.13330: handler run complete 51243 1727204721.13335: attempt loop 
complete, returning result 51243 1727204721.13341: _execute() done 51243 1727204721.13348: dumping result to json 51243 1727204721.13355: done dumping result, returning 51243 1727204721.13367: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [127b8e07-fff9-5c5d-847b-000000000007] 51243 1727204721.13392: sending task result for task 127b8e07-fff9-5c5d-847b-000000000007 51243 1727204721.13523: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000007 51243 1727204721.13527: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 51243 1727204721.13593: no more pending results, returning what we have 51243 1727204721.13596: results queue empty 51243 1727204721.13597: checking for any_errors_fatal 51243 1727204721.13750: done checking for any_errors_fatal 51243 1727204721.13751: checking for max_fail_percentage 51243 1727204721.13753: done checking for max_fail_percentage 51243 1727204721.13754: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.13755: done checking to see if all hosts have failed 51243 1727204721.13756: getting the remaining hosts for this loop 51243 1727204721.13757: done getting the remaining hosts for this loop 51243 1727204721.13761: getting the next task for host managed-node3 51243 1727204721.13770: done getting next task for host managed-node3 51243 1727204721.13772: ^ task is: TASK: meta (flush_handlers) 51243 1727204721.13774: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.13778: getting variables 51243 1727204721.13780: in VariableManager get_vars() 51243 1727204721.13809: Calling all_inventory to load vars for managed-node3 51243 1727204721.13813: Calling groups_inventory to load vars for managed-node3 51243 1727204721.13816: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.13830: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.13837: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.13841: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.14175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.14456: done with get_vars() 51243 1727204721.14471: done getting variables 51243 1727204721.14551: in VariableManager get_vars() 51243 1727204721.14561: Calling all_inventory to load vars for managed-node3 51243 1727204721.14564: Calling groups_inventory to load vars for managed-node3 51243 1727204721.14569: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.14574: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.14576: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.14579: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.14811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.15050: done with get_vars() 51243 1727204721.15189: done queuing things up, now waiting for results queue to drain 51243 1727204721.15191: results queue empty 51243 1727204721.15192: checking for any_errors_fatal 51243 1727204721.15195: done checking for any_errors_fatal 51243 1727204721.15196: checking for max_fail_percentage 51243 1727204721.15197: done checking for max_fail_percentage 51243 1727204721.15198: checking to see if all hosts have failed and the running result is not 
ok 51243 1727204721.15199: done checking to see if all hosts have failed 51243 1727204721.15200: getting the remaining hosts for this loop 51243 1727204721.15201: done getting the remaining hosts for this loop 51243 1727204721.15204: getting the next task for host managed-node3 51243 1727204721.15208: done getting next task for host managed-node3 51243 1727204721.15209: ^ task is: TASK: meta (flush_handlers) 51243 1727204721.15211: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204721.15221: getting variables 51243 1727204721.15222: in VariableManager get_vars() 51243 1727204721.15235: Calling all_inventory to load vars for managed-node3 51243 1727204721.15237: Calling groups_inventory to load vars for managed-node3 51243 1727204721.15240: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.15245: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.15248: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.15251: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.15478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.15724: done with get_vars() 51243 1727204721.15736: done getting variables 51243 1727204721.15802: in VariableManager get_vars() 51243 1727204721.15812: Calling all_inventory to load vars for managed-node3 51243 1727204721.15814: Calling groups_inventory to load vars for managed-node3 51243 1727204721.15817: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.15822: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.15824: Calling groups_plugins_inventory to load vars for 
managed-node3 51243 1727204721.15827: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.16026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.16260: done with get_vars() 51243 1727204721.16277: done queuing things up, now waiting for results queue to drain 51243 1727204721.16279: results queue empty 51243 1727204721.16280: checking for any_errors_fatal 51243 1727204721.16281: done checking for any_errors_fatal 51243 1727204721.16282: checking for max_fail_percentage 51243 1727204721.16283: done checking for max_fail_percentage 51243 1727204721.16288: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.16289: done checking to see if all hosts have failed 51243 1727204721.16290: getting the remaining hosts for this loop 51243 1727204721.16291: done getting the remaining hosts for this loop 51243 1727204721.16294: getting the next task for host managed-node3 51243 1727204721.16301: done getting next task for host managed-node3 51243 1727204721.16303: ^ task is: None 51243 1727204721.16304: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.16306: done queuing things up, now waiting for results queue to drain 51243 1727204721.16307: results queue empty 51243 1727204721.16307: checking for any_errors_fatal 51243 1727204721.16308: done checking for any_errors_fatal 51243 1727204721.16309: checking for max_fail_percentage 51243 1727204721.16310: done checking for max_fail_percentage 51243 1727204721.16311: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.16311: done checking to see if all hosts have failed 51243 1727204721.16313: getting the next task for host managed-node3 51243 1727204721.16316: done getting next task for host managed-node3 51243 1727204721.16316: ^ task is: None 51243 1727204721.16318: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.16371: in VariableManager get_vars() 51243 1727204721.16426: done with get_vars() 51243 1727204721.16435: in VariableManager get_vars() 51243 1727204721.16454: done with get_vars() 51243 1727204721.16458: variable 'omit' from source: magic vars 51243 1727204721.16489: in VariableManager get_vars() 51243 1727204721.16512: done with get_vars() 51243 1727204721.16541: variable 'omit' from source: magic vars PLAY [Play for testing wireless connection] ************************************ 51243 1727204721.17590: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 51243 1727204721.17645: getting the remaining hosts for this loop 51243 1727204721.17647: done getting the remaining hosts for this loop 51243 1727204721.17650: getting the next task for host managed-node3 51243 1727204721.17654: done getting next task for host managed-node3 51243 1727204721.17656: ^ task is: TASK: Gathering Facts 51243 1727204721.17658: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.17660: getting variables 51243 1727204721.17661: in VariableManager get_vars() 51243 1727204721.17684: Calling all_inventory to load vars for managed-node3 51243 1727204721.17687: Calling groups_inventory to load vars for managed-node3 51243 1727204721.17689: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.17695: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.17730: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.17738: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.17950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.18223: done with get_vars() 51243 1727204721.18235: done getting variables 51243 1727204721.18297: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:3 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.070) 0:00:03.790 ***** 51243 1727204721.18324: entering _queue_task() for managed-node3/gather_facts 51243 1727204721.18957: worker is 1 (out of 1 available) 51243 1727204721.18968: exiting _queue_task() for managed-node3/gather_facts 51243 1727204721.18981: done queuing things up, now waiting for results queue to drain 51243 1727204721.18983: waiting for pending results... 
51243 1727204721.19178: running TaskExecutor() for managed-node3/TASK: Gathering Facts 51243 1727204721.19306: in run() - task 127b8e07-fff9-5c5d-847b-0000000001a3 51243 1727204721.19330: variable 'ansible_search_path' from source: unknown 51243 1727204721.19395: calling self._execute() 51243 1727204721.19502: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.19515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.19537: variable 'omit' from source: magic vars 51243 1727204721.19999: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.20021: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.20173: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.20185: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.20252: when evaluation is False, skipping this task 51243 1727204721.20257: _execute() done 51243 1727204721.20260: dumping result to json 51243 1727204721.20262: done dumping result, returning 51243 1727204721.20265: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [127b8e07-fff9-5c5d-847b-0000000001a3] 51243 1727204721.20269: sending task result for task 127b8e07-fff9-5c5d-847b-0000000001a3 51243 1727204721.20349: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000001a3 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204721.20412: no more pending results, returning what we have 51243 1727204721.20418: results queue empty 51243 1727204721.20419: checking for any_errors_fatal 51243 1727204721.20421: done checking for any_errors_fatal 51243 1727204721.20422: checking for max_fail_percentage 51243 1727204721.20423: done checking for max_fail_percentage 51243 1727204721.20424: checking to see if all 
hosts have failed and the running result is not ok 51243 1727204721.20425: done checking to see if all hosts have failed 51243 1727204721.20426: getting the remaining hosts for this loop 51243 1727204721.20428: done getting the remaining hosts for this loop 51243 1727204721.20435: getting the next task for host managed-node3 51243 1727204721.20444: done getting next task for host managed-node3 51243 1727204721.20446: ^ task is: TASK: meta (flush_handlers) 51243 1727204721.20449: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204721.20453: getting variables 51243 1727204721.20455: in VariableManager get_vars() 51243 1727204721.20515: Calling all_inventory to load vars for managed-node3 51243 1727204721.20519: Calling groups_inventory to load vars for managed-node3 51243 1727204721.20521: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.20542: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.20547: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.20551: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.21167: WORKER PROCESS EXITING 51243 1727204721.21194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.21488: done with get_vars() 51243 1727204721.21504: done getting variables 51243 1727204721.21604: in VariableManager get_vars() 51243 1727204721.21640: Calling all_inventory to load vars for managed-node3 51243 1727204721.21644: Calling groups_inventory to load vars for managed-node3 51243 1727204721.21646: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.21653: Calling all_plugins_play to load vars 
for managed-node3 51243 1727204721.21655: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.21658: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.21888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.22320: done with get_vars() 51243 1727204721.22339: done queuing things up, now waiting for results queue to drain 51243 1727204721.22342: results queue empty 51243 1727204721.22342: checking for any_errors_fatal 51243 1727204721.22346: done checking for any_errors_fatal 51243 1727204721.22346: checking for max_fail_percentage 51243 1727204721.22347: done checking for max_fail_percentage 51243 1727204721.22348: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.22350: done checking to see if all hosts have failed 51243 1727204721.22350: getting the remaining hosts for this loop 51243 1727204721.22351: done getting the remaining hosts for this loop 51243 1727204721.22354: getting the next task for host managed-node3 51243 1727204721.22358: done getting next task for host managed-node3 51243 1727204721.22361: ^ task is: TASK: INIT: wireless tests 51243 1727204721.22362: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.22365: getting variables 51243 1727204721.22373: in VariableManager get_vars() 51243 1727204721.22394: Calling all_inventory to load vars for managed-node3 51243 1727204721.22396: Calling groups_inventory to load vars for managed-node3 51243 1727204721.22399: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.22405: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.22407: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.22411: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.22760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.23176: done with get_vars() 51243 1727204721.23188: done getting variables 51243 1727204721.23535: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT: wireless tests] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:8 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.052) 0:00:03.843 ***** 51243 1727204721.23611: entering _queue_task() for managed-node3/debug 51243 1727204721.23613: Creating lock for debug 51243 1727204721.24093: worker is 1 (out of 1 available) 51243 1727204721.24207: exiting _queue_task() for managed-node3/debug 51243 1727204721.24221: done queuing things up, now waiting for results queue to drain 51243 1727204721.24223: waiting for pending results... 
51243 1727204721.24694: running TaskExecutor() for managed-node3/TASK: INIT: wireless tests 51243 1727204721.24817: in run() - task 127b8e07-fff9-5c5d-847b-00000000000b 51243 1727204721.24922: variable 'ansible_search_path' from source: unknown 51243 1727204721.24926: calling self._execute() 51243 1727204721.24988: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.25000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.25015: variable 'omit' from source: magic vars 51243 1727204721.25430: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.25449: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.25590: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.25602: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.25609: when evaluation is False, skipping this task 51243 1727204721.25617: _execute() done 51243 1727204721.25624: dumping result to json 51243 1727204721.25631: done dumping result, returning 51243 1727204721.25685: done running TaskExecutor() for managed-node3/TASK: INIT: wireless tests [127b8e07-fff9-5c5d-847b-00000000000b] 51243 1727204721.25698: sending task result for task 127b8e07-fff9-5c5d-847b-00000000000b skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204721.25983: no more pending results, returning what we have 51243 1727204721.25986: results queue empty 51243 1727204721.25987: checking for any_errors_fatal 51243 1727204721.25990: done checking for any_errors_fatal 51243 1727204721.25991: checking for max_fail_percentage 51243 1727204721.25994: done checking for max_fail_percentage 51243 1727204721.25995: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.25996: done checking to see if all hosts have failed 51243 1727204721.25996: 
getting the remaining hosts for this loop 51243 1727204721.25998: done getting the remaining hosts for this loop 51243 1727204721.26003: getting the next task for host managed-node3 51243 1727204721.26011: done getting next task for host managed-node3 51243 1727204721.26017: ^ task is: TASK: Include the task 'setup_mock_wifi.yml' 51243 1727204721.26020: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204721.26024: getting variables 51243 1727204721.26027: in VariableManager get_vars() 51243 1727204721.26092: Calling all_inventory to load vars for managed-node3 51243 1727204721.26096: Calling groups_inventory to load vars for managed-node3 51243 1727204721.26099: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.26115: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.26118: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.26122: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.26450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.26789: done with get_vars() 51243 1727204721.26802: done getting variables 51243 1727204721.26837: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000000b 51243 1727204721.26840: WORKER PROCESS EXITING TASK [Include the task 'setup_mock_wifi.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:11 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.033) 0:00:03.876 ***** 51243 1727204721.26917: entering _queue_task() for managed-node3/include_tasks 51243 1727204721.27294: worker is 1 
(out of 1 available) 51243 1727204721.27307: exiting _queue_task() for managed-node3/include_tasks 51243 1727204721.27435: done queuing things up, now waiting for results queue to drain 51243 1727204721.27438: waiting for pending results... 51243 1727204721.27551: running TaskExecutor() for managed-node3/TASK: Include the task 'setup_mock_wifi.yml' 51243 1727204721.27718: in run() - task 127b8e07-fff9-5c5d-847b-00000000000c 51243 1727204721.27775: variable 'ansible_search_path' from source: unknown 51243 1727204721.27828: calling self._execute() 51243 1727204721.27927: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.27943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.27958: variable 'omit' from source: magic vars 51243 1727204721.28404: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.28423: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.28569: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.28581: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.28588: when evaluation is False, skipping this task 51243 1727204721.28596: _execute() done 51243 1727204721.28603: dumping result to json 51243 1727204721.28610: done dumping result, returning 51243 1727204721.28619: done running TaskExecutor() for managed-node3/TASK: Include the task 'setup_mock_wifi.yml' [127b8e07-fff9-5c5d-847b-00000000000c] 51243 1727204721.28628: sending task result for task 127b8e07-fff9-5c5d-847b-00000000000c skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204721.28818: no more pending results, returning what we have 51243 1727204721.28822: results queue empty 51243 1727204721.28823: checking for any_errors_fatal 51243 1727204721.28836: done 
checking for any_errors_fatal 51243 1727204721.28836: checking for max_fail_percentage 51243 1727204721.28839: done checking for max_fail_percentage 51243 1727204721.28840: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.28841: done checking to see if all hosts have failed 51243 1727204721.28841: getting the remaining hosts for this loop 51243 1727204721.28843: done getting the remaining hosts for this loop 51243 1727204721.28849: getting the next task for host managed-node3 51243 1727204721.28856: done getting next task for host managed-node3 51243 1727204721.28859: ^ task is: TASK: Copy client certs 51243 1727204721.28862: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204721.28968: getting variables 51243 1727204721.28971: in VariableManager get_vars() 51243 1727204721.29031: Calling all_inventory to load vars for managed-node3 51243 1727204721.29037: Calling groups_inventory to load vars for managed-node3 51243 1727204721.29039: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.29055: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.29058: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.29061: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.29506: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000000c 51243 1727204721.29510: WORKER PROCESS EXITING 51243 1727204721.29540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.29793: done with get_vars() 51243 1727204721.29810: done getting variables 51243 1727204721.29880: Loading ActionModule 'copy' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Copy client certs] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:13 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.029) 0:00:03.906 ***** 51243 1727204721.29914: entering _queue_task() for managed-node3/copy 51243 1727204721.30379: worker is 1 (out of 1 available) 51243 1727204721.30392: exiting _queue_task() for managed-node3/copy 51243 1727204721.30403: done queuing things up, now waiting for results queue to drain 51243 1727204721.30405: waiting for pending results... 51243 1727204721.30592: running TaskExecutor() for managed-node3/TASK: Copy client certs 51243 1727204721.30704: in run() - task 127b8e07-fff9-5c5d-847b-00000000000d 51243 1727204721.30725: variable 'ansible_search_path' from source: unknown 51243 1727204721.31019: Loaded config def from plugin (lookup/items) 51243 1727204721.31035: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 51243 1727204721.31105: variable 'omit' from source: magic vars 51243 1727204721.31257: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.31275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.31296: variable 'omit' from source: magic vars 51243 1727204721.31772: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.31778: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.31887: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.31900: Evaluated conditional (ansible_distribution_major_version 
== '7'): False 51243 1727204721.31908: when evaluation is False, skipping this task 51243 1727204721.31949: variable 'item' from source: unknown 51243 1727204721.32047: variable 'item' from source: unknown skipping: [managed-node3] => (item=client.key) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "client.key", "skip_reason": "Conditional result was False" } 51243 1727204721.32438: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.32441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.32444: variable 'omit' from source: magic vars 51243 1727204721.32546: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.32558: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.32690: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.32748: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.32751: when evaluation is False, skipping this task 51243 1727204721.32756: variable 'item' from source: unknown 51243 1727204721.32822: variable 'item' from source: unknown skipping: [managed-node3] => (item=client.pem) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "client.pem", "skip_reason": "Conditional result was False" } 51243 1727204721.33185: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.33189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.33191: variable 'omit' from source: magic vars 51243 1727204721.33203: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.33214: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.33345: variable 'ansible_distribution_major_version' 
from source: facts 51243 1727204721.33356: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.33362: when evaluation is False, skipping this task 51243 1727204721.33400: variable 'item' from source: unknown 51243 1727204721.33474: variable 'item' from source: unknown skipping: [managed-node3] => (item=cacert.pem) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "cacert.pem", "skip_reason": "Conditional result was False" } 51243 1727204721.33675: dumping result to json 51243 1727204721.33678: done dumping result, returning 51243 1727204721.33681: done running TaskExecutor() for managed-node3/TASK: Copy client certs [127b8e07-fff9-5c5d-847b-00000000000d] 51243 1727204721.33684: sending task result for task 127b8e07-fff9-5c5d-847b-00000000000d skipping: [managed-node3] => { "changed": false } MSG: All items skipped 51243 1727204721.33836: no more pending results, returning what we have 51243 1727204721.33840: results queue empty 51243 1727204721.33841: checking for any_errors_fatal 51243 1727204721.33846: done checking for any_errors_fatal 51243 1727204721.33847: checking for max_fail_percentage 51243 1727204721.33850: done checking for max_fail_percentage 51243 1727204721.33850: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.33851: done checking to see if all hosts have failed 51243 1727204721.33852: getting the remaining hosts for this loop 51243 1727204721.33854: done getting the remaining hosts for this loop 51243 1727204721.33859: getting the next task for host managed-node3 51243 1727204721.33874: done getting next task for host managed-node3 51243 1727204721.33878: ^ task is: TASK: TEST: wireless connection with WPA-PSK 51243 1727204721.33881: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204721.33885: getting variables 51243 1727204721.33887: in VariableManager get_vars() 51243 1727204721.33946: Calling all_inventory to load vars for managed-node3 51243 1727204721.33949: Calling groups_inventory to load vars for managed-node3 51243 1727204721.33952: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.34073: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000000d 51243 1727204721.34077: WORKER PROCESS EXITING 51243 1727204721.34093: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.34096: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.34100: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.34469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.34716: done with get_vars() 51243 1727204721.34729: done getting variables 51243 1727204721.34798: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: wireless connection with WPA-PSK] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:24 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.049) 0:00:03.956 ***** 51243 1727204721.34835: entering _queue_task() for managed-node3/debug 51243 1727204721.35277: worker is 1 (out of 1 available) 51243 1727204721.35290: exiting _queue_task() for managed-node3/debug 51243 1727204721.35301: done queuing things up, now waiting for results queue to drain 51243 
1727204721.35303: waiting for pending results... 51243 1727204721.35496: running TaskExecutor() for managed-node3/TASK: TEST: wireless connection with WPA-PSK 51243 1727204721.35617: in run() - task 127b8e07-fff9-5c5d-847b-00000000000f 51243 1727204721.35645: variable 'ansible_search_path' from source: unknown 51243 1727204721.35698: calling self._execute() 51243 1727204721.35803: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.35817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.35836: variable 'omit' from source: magic vars 51243 1727204721.36286: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.36309: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.36471: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.36474: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.36477: when evaluation is False, skipping this task 51243 1727204721.36480: _execute() done 51243 1727204721.36482: dumping result to json 51243 1727204721.36553: done dumping result, returning 51243 1727204721.36556: done running TaskExecutor() for managed-node3/TASK: TEST: wireless connection with WPA-PSK [127b8e07-fff9-5c5d-847b-00000000000f] 51243 1727204721.36559: sending task result for task 127b8e07-fff9-5c5d-847b-00000000000f 51243 1727204721.36638: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000000f 51243 1727204721.36642: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204721.36701: no more pending results, returning what we have 51243 1727204721.36705: results queue empty 51243 1727204721.36706: checking for any_errors_fatal 51243 1727204721.36715: done checking for any_errors_fatal 51243 1727204721.36716: checking for max_fail_percentage 51243 1727204721.36718: done 
checking for max_fail_percentage 51243 1727204721.36719: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.36720: done checking to see if all hosts have failed 51243 1727204721.36721: getting the remaining hosts for this loop 51243 1727204721.36723: done getting the remaining hosts for this loop 51243 1727204721.36727: getting the next task for host managed-node3 51243 1727204721.36739: done getting next task for host managed-node3 51243 1727204721.36747: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 51243 1727204721.36751: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.36890: getting variables 51243 1727204721.36892: in VariableManager get_vars() 51243 1727204721.36952: Calling all_inventory to load vars for managed-node3 51243 1727204721.36955: Calling groups_inventory to load vars for managed-node3 51243 1727204721.36958: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.37073: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.37077: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.37081: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.37393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.37657: done with get_vars() 51243 1727204721.37672: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.029) 0:00:03.985 ***** 51243 1727204721.37788: entering _queue_task() for managed-node3/include_tasks 51243 1727204721.38156: worker is 1 (out of 1 available) 51243 1727204721.38300: exiting _queue_task() for managed-node3/include_tasks 51243 1727204721.38313: done queuing things up, now waiting for results queue to drain 51243 1727204721.38314: waiting for pending results... 
51243 1727204721.38640: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 51243 1727204721.38648: in run() - task 127b8e07-fff9-5c5d-847b-000000000017 51243 1727204721.38673: variable 'ansible_search_path' from source: unknown 51243 1727204721.38681: variable 'ansible_search_path' from source: unknown 51243 1727204721.38725: calling self._execute() 51243 1727204721.38844: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.38850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.38872: variable 'omit' from source: magic vars 51243 1727204721.39307: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.39313: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.39491: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.39495: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.39498: when evaluation is False, skipping this task 51243 1727204721.39500: _execute() done 51243 1727204721.39502: dumping result to json 51243 1727204721.39504: done dumping result, returning 51243 1727204721.39507: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-5c5d-847b-000000000017] 51243 1727204721.39509: sending task result for task 127b8e07-fff9-5c5d-847b-000000000017 51243 1727204721.39748: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000017 51243 1727204721.39751: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204721.39807: no more pending results, returning what we have 51243 1727204721.39925: results queue empty 51243 1727204721.39926: checking for 
any_errors_fatal 51243 1727204721.39934: done checking for any_errors_fatal 51243 1727204721.39936: checking for max_fail_percentage 51243 1727204721.39937: done checking for max_fail_percentage 51243 1727204721.39938: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.39939: done checking to see if all hosts have failed 51243 1727204721.39940: getting the remaining hosts for this loop 51243 1727204721.39942: done getting the remaining hosts for this loop 51243 1727204721.39946: getting the next task for host managed-node3 51243 1727204721.39953: done getting next task for host managed-node3 51243 1727204721.39957: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 51243 1727204721.39961: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.39980: getting variables 51243 1727204721.39982: in VariableManager get_vars() 51243 1727204721.40030: Calling all_inventory to load vars for managed-node3 51243 1727204721.40035: Calling groups_inventory to load vars for managed-node3 51243 1727204721.40038: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.40048: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.40051: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.40054: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.40637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.40898: done with get_vars() 51243 1727204721.40910: done getting variables 51243 1727204721.40981: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.032) 0:00:04.017 ***** 51243 1727204721.41014: entering _queue_task() for managed-node3/debug 51243 1727204721.41411: worker is 1 (out of 1 available) 51243 1727204721.41423: exiting _queue_task() for managed-node3/debug 51243 1727204721.41439: done queuing things up, now waiting for results queue to drain 51243 1727204721.41440: waiting for pending results... 
51243 1727204721.41715: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 51243 1727204721.41855: in run() - task 127b8e07-fff9-5c5d-847b-000000000018 51243 1727204721.41881: variable 'ansible_search_path' from source: unknown 51243 1727204721.41890: variable 'ansible_search_path' from source: unknown 51243 1727204721.41945: calling self._execute() 51243 1727204721.42051: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.42068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.42087: variable 'omit' from source: magic vars 51243 1727204721.42515: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.42537: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.42771: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.42775: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.42780: when evaluation is False, skipping this task 51243 1727204721.42783: _execute() done 51243 1727204721.42785: dumping result to json 51243 1727204721.42793: done dumping result, returning 51243 1727204721.42799: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-5c5d-847b-000000000018] 51243 1727204721.42802: sending task result for task 127b8e07-fff9-5c5d-847b-000000000018 51243 1727204721.42887: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000018 51243 1727204721.42891: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204721.42964: no more pending results, returning what we have 51243 1727204721.42971: results queue empty 51243 1727204721.42972: checking for any_errors_fatal 51243 1727204721.42980: done checking for any_errors_fatal 51243 1727204721.42981: 
checking for max_fail_percentage 51243 1727204721.42982: done checking for max_fail_percentage 51243 1727204721.42983: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.42984: done checking to see if all hosts have failed 51243 1727204721.42985: getting the remaining hosts for this loop 51243 1727204721.42986: done getting the remaining hosts for this loop 51243 1727204721.42991: getting the next task for host managed-node3 51243 1727204721.42999: done getting next task for host managed-node3 51243 1727204721.43003: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 51243 1727204721.43013: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.43031: getting variables 51243 1727204721.43036: in VariableManager get_vars() 51243 1727204721.43148: Calling all_inventory to load vars for managed-node3 51243 1727204721.43151: Calling groups_inventory to load vars for managed-node3 51243 1727204721.43153: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.43168: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.43171: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.43174: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.43629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.43899: done with get_vars() 51243 1727204721.43919: done getting variables 51243 1727204721.44035: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.030) 0:00:04.048 ***** 51243 1727204721.44076: entering _queue_task() for managed-node3/fail 51243 1727204721.44078: Creating lock for fail 51243 1727204721.44566: worker is 1 (out of 1 available) 51243 1727204721.44581: exiting _queue_task() for managed-node3/fail 51243 1727204721.44592: done queuing things up, now waiting for results queue to drain 51243 1727204721.44594: waiting for pending results... 
51243 1727204721.44837: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 51243 1727204721.45002: in run() - task 127b8e07-fff9-5c5d-847b-000000000019 51243 1727204721.45006: variable 'ansible_search_path' from source: unknown 51243 1727204721.45009: variable 'ansible_search_path' from source: unknown 51243 1727204721.45054: calling self._execute() 51243 1727204721.45220: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.45225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.45228: variable 'omit' from source: magic vars 51243 1727204721.45635: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.45663: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.45900: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.45911: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.45920: when evaluation is False, skipping this task 51243 1727204721.45928: _execute() done 51243 1727204721.45939: dumping result to json 51243 1727204721.45947: done dumping result, returning 51243 1727204721.45959: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-5c5d-847b-000000000019] 51243 1727204721.45983: sending task result for task 127b8e07-fff9-5c5d-847b-000000000019 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204721.46252: no more pending results, returning what we have 51243 1727204721.46256: results queue empty 51243 1727204721.46257: 
checking for any_errors_fatal 51243 1727204721.46266: done checking for any_errors_fatal 51243 1727204721.46267: checking for max_fail_percentage 51243 1727204721.46269: done checking for max_fail_percentage 51243 1727204721.46270: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.46271: done checking to see if all hosts have failed 51243 1727204721.46272: getting the remaining hosts for this loop 51243 1727204721.46273: done getting the remaining hosts for this loop 51243 1727204721.46278: getting the next task for host managed-node3 51243 1727204721.46287: done getting next task for host managed-node3 51243 1727204721.46291: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 51243 1727204721.46295: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.46477: getting variables 51243 1727204721.46479: in VariableManager get_vars() 51243 1727204721.46529: Calling all_inventory to load vars for managed-node3 51243 1727204721.46535: Calling groups_inventory to load vars for managed-node3 51243 1727204721.46538: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.46549: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.46552: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.46555: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.46869: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000019 51243 1727204721.46872: WORKER PROCESS EXITING 51243 1727204721.46902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.47163: done with get_vars() 51243 1727204721.47178: done getting variables 51243 1727204721.47254: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.032) 0:00:04.080 ***** 51243 1727204721.47293: entering _queue_task() for managed-node3/fail 51243 1727204721.47637: worker is 1 (out of 1 available) 51243 1727204721.47650: exiting _queue_task() for managed-node3/fail 51243 1727204721.47779: done queuing things up, now waiting for results queue to drain 51243 1727204721.47781: waiting for pending results... 
51243 1727204721.47955: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 51243 1727204721.48116: in run() - task 127b8e07-fff9-5c5d-847b-00000000001a 51243 1727204721.48141: variable 'ansible_search_path' from source: unknown 51243 1727204721.48151: variable 'ansible_search_path' from source: unknown 51243 1727204721.48202: calling self._execute() 51243 1727204721.48311: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.48326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.48348: variable 'omit' from source: magic vars 51243 1727204721.48817: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.48841: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.48988: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.49000: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.49007: when evaluation is False, skipping this task 51243 1727204721.49015: _execute() done 51243 1727204721.49023: dumping result to json 51243 1727204721.49031: done dumping result, returning 51243 1727204721.49048: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-5c5d-847b-00000000001a] 51243 1727204721.49060: sending task result for task 127b8e07-fff9-5c5d-847b-00000000001a 51243 1727204721.49312: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000001a 51243 1727204721.49317: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204721.49375: no more 
pending results, returning what we have 51243 1727204721.49379: results queue empty 51243 1727204721.49381: checking for any_errors_fatal 51243 1727204721.49388: done checking for any_errors_fatal 51243 1727204721.49389: checking for max_fail_percentage 51243 1727204721.49391: done checking for max_fail_percentage 51243 1727204721.49392: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.49393: done checking to see if all hosts have failed 51243 1727204721.49398: getting the remaining hosts for this loop 51243 1727204721.49400: done getting the remaining hosts for this loop 51243 1727204721.49405: getting the next task for host managed-node3 51243 1727204721.49413: done getting next task for host managed-node3 51243 1727204721.49417: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 51243 1727204721.49422: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.49443: getting variables 51243 1727204721.49446: in VariableManager get_vars() 51243 1727204721.49619: Calling all_inventory to load vars for managed-node3 51243 1727204721.49622: Calling groups_inventory to load vars for managed-node3 51243 1727204721.49625: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.49639: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.49642: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.49646: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.49949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.50251: done with get_vars() 51243 1727204721.50269: done getting variables 51243 1727204721.50338: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.030) 0:00:04.111 ***** 51243 1727204721.50376: entering _queue_task() for managed-node3/fail 51243 1727204721.50844: worker is 1 (out of 1 available) 51243 1727204721.50857: exiting _queue_task() for managed-node3/fail 51243 1727204721.50871: done queuing things up, now waiting for results queue to drain 51243 1727204721.50872: waiting for pending results... 
51243 1727204721.51073: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 51243 1727204721.51363: in run() - task 127b8e07-fff9-5c5d-847b-00000000001b 51243 1727204721.51369: variable 'ansible_search_path' from source: unknown 51243 1727204721.51371: variable 'ansible_search_path' from source: unknown 51243 1727204721.51374: calling self._execute() 51243 1727204721.51427: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.51445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.51461: variable 'omit' from source: magic vars 51243 1727204721.51927: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.51951: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.52093: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.52105: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.52112: when evaluation is False, skipping this task 51243 1727204721.52125: _execute() done 51243 1727204721.52142: dumping result to json 51243 1727204721.52152: done dumping result, returning 51243 1727204721.52242: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-5c5d-847b-00000000001b] 51243 1727204721.52248: sending task result for task 127b8e07-fff9-5c5d-847b-00000000001b 51243 1727204721.52330: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000001b 51243 1727204721.52336: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204721.52397: no more pending 
results, returning what we have 51243 1727204721.52401: results queue empty 51243 1727204721.52402: checking for any_errors_fatal 51243 1727204721.52410: done checking for any_errors_fatal 51243 1727204721.52410: checking for max_fail_percentage 51243 1727204721.52412: done checking for max_fail_percentage 51243 1727204721.52413: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.52414: done checking to see if all hosts have failed 51243 1727204721.52415: getting the remaining hosts for this loop 51243 1727204721.52417: done getting the remaining hosts for this loop 51243 1727204721.52422: getting the next task for host managed-node3 51243 1727204721.52432: done getting next task for host managed-node3 51243 1727204721.52439: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 51243 1727204721.52443: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.52461: getting variables 51243 1727204721.52463: in VariableManager get_vars() 51243 1727204721.52524: Calling all_inventory to load vars for managed-node3 51243 1727204721.52527: Calling groups_inventory to load vars for managed-node3 51243 1727204721.52530: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.52548: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.52552: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.52555: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.53283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.53750: done with get_vars() 51243 1727204721.53768: done getting variables 51243 1727204721.54285: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.039) 0:00:04.150 ***** 51243 1727204721.54322: entering _queue_task() for managed-node3/dnf 51243 1727204721.54893: worker is 1 (out of 1 available) 51243 1727204721.54910: exiting _queue_task() for managed-node3/dnf 51243 1727204721.54922: done queuing things up, now waiting for results queue to drain 51243 1727204721.54924: waiting for pending results... 
51243 1727204721.55562: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 51243 1727204721.55639: in run() - task 127b8e07-fff9-5c5d-847b-00000000001c 51243 1727204721.55644: variable 'ansible_search_path' from source: unknown 51243 1727204721.55647: variable 'ansible_search_path' from source: unknown 51243 1727204721.55725: calling self._execute() 51243 1727204721.55756: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.55764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.56083: variable 'omit' from source: magic vars 51243 1727204721.56988: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.57029: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.57477: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.57481: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.57484: when evaluation is False, skipping this task 51243 1727204721.57487: _execute() done 51243 1727204721.57490: dumping result to json 51243 1727204721.57492: done dumping result, returning 51243 1727204721.57495: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-00000000001c] 51243 1727204721.57498: sending task result for task 127b8e07-fff9-5c5d-847b-00000000001c 51243 1727204721.57707: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000001c 51243 1727204721.57711: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 51243 1727204721.57825: no more pending results, returning what we have 51243 1727204721.57828: results queue empty 51243 1727204721.57830: checking for any_errors_fatal 51243 1727204721.57839: done checking for any_errors_fatal 51243 1727204721.57840: checking for max_fail_percentage 51243 1727204721.57842: done checking for max_fail_percentage 51243 1727204721.57842: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.57843: done checking to see if all hosts have failed 51243 1727204721.57844: getting the remaining hosts for this loop 51243 1727204721.57845: done getting the remaining hosts for this loop 51243 1727204721.57850: getting the next task for host managed-node3 51243 1727204721.57856: done getting next task for host managed-node3 51243 1727204721.57860: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 51243 1727204721.57863: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.57881: getting variables 51243 1727204721.57883: in VariableManager get_vars() 51243 1727204721.57940: Calling all_inventory to load vars for managed-node3 51243 1727204721.57944: Calling groups_inventory to load vars for managed-node3 51243 1727204721.57946: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.57959: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.57962: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.58170: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.58647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.59117: done with get_vars() 51243 1727204721.59132: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 51243 1727204721.59638: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.054) 0:00:04.205 ***** 51243 1727204721.59782: entering _queue_task() for managed-node3/yum 51243 1727204721.59784: Creating lock for yum 51243 1727204721.60463: worker is 1 (out of 1 available) 51243 1727204721.60583: exiting _queue_task() for managed-node3/yum 51243 1727204721.60597: done queuing things up, now waiting for results queue to drain 51243 1727204721.60599: waiting for pending results... 
51243 1727204721.61231: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 51243 1727204721.61374: in run() - task 127b8e07-fff9-5c5d-847b-00000000001d 51243 1727204721.61475: variable 'ansible_search_path' from source: unknown 51243 1727204721.61485: variable 'ansible_search_path' from source: unknown 51243 1727204721.61539: calling self._execute() 51243 1727204721.61640: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.61654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.61672: variable 'omit' from source: magic vars 51243 1727204721.62099: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.62118: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.62253: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.62264: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.62274: when evaluation is False, skipping this task 51243 1727204721.62281: _execute() done 51243 1727204721.62293: dumping result to json 51243 1727204721.62301: done dumping result, returning 51243 1727204721.62314: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-00000000001d] 51243 1727204721.62399: sending task result for task 127b8e07-fff9-5c5d-847b-00000000001d 51243 1727204721.62489: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000001d 51243 1727204721.62492: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 51243 1727204721.62562: no more pending results, returning what we have 51243 1727204721.62568: results queue empty 51243 1727204721.62569: checking for any_errors_fatal 51243 1727204721.62576: done checking for any_errors_fatal 51243 1727204721.62577: checking for max_fail_percentage 51243 1727204721.62579: done checking for max_fail_percentage 51243 1727204721.62580: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.62581: done checking to see if all hosts have failed 51243 1727204721.62582: getting the remaining hosts for this loop 51243 1727204721.62584: done getting the remaining hosts for this loop 51243 1727204721.62588: getting the next task for host managed-node3 51243 1727204721.62595: done getting next task for host managed-node3 51243 1727204721.62599: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 51243 1727204721.62603: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.62620: getting variables 51243 1727204721.62622: in VariableManager get_vars() 51243 1727204721.62788: Calling all_inventory to load vars for managed-node3 51243 1727204721.62792: Calling groups_inventory to load vars for managed-node3 51243 1727204721.62794: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.62810: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.62814: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.62818: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.63201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.63421: done with get_vars() 51243 1727204721.63440: done getting variables 51243 1727204721.63515: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.037) 0:00:04.243 ***** 51243 1727204721.63574: entering _queue_task() for managed-node3/fail 51243 1727204721.64177: worker is 1 (out of 1 available) 51243 1727204721.64189: exiting _queue_task() for managed-node3/fail 51243 1727204721.64201: done queuing things up, now waiting for results queue to drain 51243 1727204721.64203: waiting for pending results... 
51243 1727204721.64385: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 51243 1727204721.64482: in run() - task 127b8e07-fff9-5c5d-847b-00000000001e 51243 1727204721.64506: variable 'ansible_search_path' from source: unknown 51243 1727204721.64514: variable 'ansible_search_path' from source: unknown 51243 1727204721.64567: calling self._execute() 51243 1727204721.64757: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.64762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.64764: variable 'omit' from source: magic vars 51243 1727204721.65240: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.65261: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.65405: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.65423: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.65430: when evaluation is False, skipping this task 51243 1727204721.65443: _execute() done 51243 1727204721.65450: dumping result to json 51243 1727204721.65457: done dumping result, returning 51243 1727204721.65473: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-00000000001e] 51243 1727204721.65484: sending task result for task 127b8e07-fff9-5c5d-847b-00000000001e skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204721.65688: no more pending results, returning what we have 51243 1727204721.65693: results queue empty 51243 1727204721.65694: checking for any_errors_fatal 51243 1727204721.65703: done checking for 
any_errors_fatal 51243 1727204721.65703: checking for max_fail_percentage 51243 1727204721.65705: done checking for max_fail_percentage 51243 1727204721.65706: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.65707: done checking to see if all hosts have failed 51243 1727204721.65708: getting the remaining hosts for this loop 51243 1727204721.65710: done getting the remaining hosts for this loop 51243 1727204721.65715: getting the next task for host managed-node3 51243 1727204721.65724: done getting next task for host managed-node3 51243 1727204721.65728: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 51243 1727204721.65732: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.65754: getting variables 51243 1727204721.65756: in VariableManager get_vars() 51243 1727204721.65817: Calling all_inventory to load vars for managed-node3 51243 1727204721.65821: Calling groups_inventory to load vars for managed-node3 51243 1727204721.65823: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.65840: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.65843: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.65846: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.66373: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000001e 51243 1727204721.66377: WORKER PROCESS EXITING 51243 1727204721.66404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.66659: done with get_vars() 51243 1727204721.66699: done getting variables 51243 1727204721.66798: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.032) 0:00:04.276 ***** 51243 1727204721.66839: entering _queue_task() for managed-node3/package 51243 1727204721.67208: worker is 1 (out of 1 available) 51243 1727204721.67222: exiting _queue_task() for managed-node3/package 51243 1727204721.67236: done queuing things up, now waiting for results queue to drain 51243 1727204721.67238: waiting for pending results... 
51243 1727204721.67528: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 51243 1727204721.67693: in run() - task 127b8e07-fff9-5c5d-847b-00000000001f 51243 1727204721.67716: variable 'ansible_search_path' from source: unknown 51243 1727204721.67724: variable 'ansible_search_path' from source: unknown 51243 1727204721.67776: calling self._execute() 51243 1727204721.67877: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.67908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.67912: variable 'omit' from source: magic vars 51243 1727204721.68358: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.68429: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.68519: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.68536: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.68546: when evaluation is False, skipping this task 51243 1727204721.68556: _execute() done 51243 1727204721.68564: dumping result to json 51243 1727204721.68573: done dumping result, returning 51243 1727204721.68584: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-5c5d-847b-00000000001f] 51243 1727204721.68595: sending task result for task 127b8e07-fff9-5c5d-847b-00000000001f 51243 1727204721.68891: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000001f 51243 1727204721.68895: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204721.68957: no more pending results, returning what we have 51243 1727204721.68961: results queue empty 51243 1727204721.68962: checking for any_errors_fatal 51243 1727204721.69072: done 
checking for any_errors_fatal 51243 1727204721.69074: checking for max_fail_percentage 51243 1727204721.69079: done checking for max_fail_percentage 51243 1727204721.69081: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.69082: done checking to see if all hosts have failed 51243 1727204721.69082: getting the remaining hosts for this loop 51243 1727204721.69084: done getting the remaining hosts for this loop 51243 1727204721.69088: getting the next task for host managed-node3 51243 1727204721.69095: done getting next task for host managed-node3 51243 1727204721.69099: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 51243 1727204721.69103: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.69119: getting variables 51243 1727204721.69120: in VariableManager get_vars() 51243 1727204721.69280: Calling all_inventory to load vars for managed-node3 51243 1727204721.69284: Calling groups_inventory to load vars for managed-node3 51243 1727204721.69287: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.69302: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.69305: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.69308: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.69524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.70017: done with get_vars() 51243 1727204721.70031: done getting variables 51243 1727204721.70306: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.035) 0:00:04.311 ***** 51243 1727204721.70348: entering _queue_task() for managed-node3/package 51243 1727204721.71049: worker is 1 (out of 1 available) 51243 1727204721.71068: exiting _queue_task() for managed-node3/package 51243 1727204721.71083: done queuing things up, now waiting for results queue to drain 51243 1727204721.71084: waiting for pending results... 
51243 1727204721.71688: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 51243 1727204721.72036: in run() - task 127b8e07-fff9-5c5d-847b-000000000020 51243 1727204721.72041: variable 'ansible_search_path' from source: unknown 51243 1727204721.72044: variable 'ansible_search_path' from source: unknown 51243 1727204721.72046: calling self._execute() 51243 1727204721.72138: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.72474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.72478: variable 'omit' from source: magic vars 51243 1727204721.73178: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.73198: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.73491: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.73504: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.73512: when evaluation is False, skipping this task 51243 1727204721.73519: _execute() done 51243 1727204721.73526: dumping result to json 51243 1727204721.73533: done dumping result, returning 51243 1727204721.73547: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-5c5d-847b-000000000020] 51243 1727204721.73564: sending task result for task 127b8e07-fff9-5c5d-847b-000000000020 51243 1727204721.73847: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000020 51243 1727204721.73851: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204721.73913: no more pending results, returning what we have 51243 1727204721.73917: 
results queue empty 51243 1727204721.73918: checking for any_errors_fatal 51243 1727204721.73925: done checking for any_errors_fatal 51243 1727204721.73926: checking for max_fail_percentage 51243 1727204721.73927: done checking for max_fail_percentage 51243 1727204721.73928: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.73929: done checking to see if all hosts have failed 51243 1727204721.73930: getting the remaining hosts for this loop 51243 1727204721.73931: done getting the remaining hosts for this loop 51243 1727204721.73938: getting the next task for host managed-node3 51243 1727204721.73946: done getting next task for host managed-node3 51243 1727204721.73950: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 51243 1727204721.73953: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.73972: getting variables 51243 1727204721.73973: in VariableManager get_vars() 51243 1727204721.74028: Calling all_inventory to load vars for managed-node3 51243 1727204721.74031: Calling groups_inventory to load vars for managed-node3 51243 1727204721.74036: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.74050: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.74053: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.74056: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.74454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.75120: done with get_vars() 51243 1727204721.75139: done getting variables 51243 1727204721.75207: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.048) 0:00:04.360 ***** 51243 1727204721.75249: entering _queue_task() for managed-node3/package 51243 1727204721.76213: worker is 1 (out of 1 available) 51243 1727204721.76229: exiting _queue_task() for managed-node3/package 51243 1727204721.76245: done queuing things up, now waiting for results queue to drain 51243 1727204721.76247: waiting for pending results... 
51243 1727204721.76660: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 51243 1727204721.77075: in run() - task 127b8e07-fff9-5c5d-847b-000000000021 51243 1727204721.77080: variable 'ansible_search_path' from source: unknown 51243 1727204721.77083: variable 'ansible_search_path' from source: unknown 51243 1727204721.77272: calling self._execute() 51243 1727204721.77354: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.77429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.77447: variable 'omit' from source: magic vars 51243 1727204721.78688: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.78709: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.79113: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.79117: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.79120: when evaluation is False, skipping this task 51243 1727204721.79123: _execute() done 51243 1727204721.79126: dumping result to json 51243 1727204721.79128: done dumping result, returning 51243 1727204721.79136: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-5c5d-847b-000000000021] 51243 1727204721.79150: sending task result for task 127b8e07-fff9-5c5d-847b-000000000021 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204721.79526: no more pending results, returning what we have 51243 1727204721.79531: results queue empty 51243 1727204721.79535: checking for any_errors_fatal 51243 1727204721.79543: done checking for any_errors_fatal 51243 1727204721.79544: 
checking for max_fail_percentage 51243 1727204721.79546: done checking for max_fail_percentage 51243 1727204721.79547: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.79548: done checking to see if all hosts have failed 51243 1727204721.79548: getting the remaining hosts for this loop 51243 1727204721.79550: done getting the remaining hosts for this loop 51243 1727204721.79555: getting the next task for host managed-node3 51243 1727204721.79564: done getting next task for host managed-node3 51243 1727204721.79571: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 51243 1727204721.79575: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.79592: getting variables 51243 1727204721.79594: in VariableManager get_vars() 51243 1727204721.79655: Calling all_inventory to load vars for managed-node3 51243 1727204721.79658: Calling groups_inventory to load vars for managed-node3 51243 1727204721.79661: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.80019: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.80024: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.80031: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000021 51243 1727204721.80037: WORKER PROCESS EXITING 51243 1727204721.80041: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.80454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.80714: done with get_vars() 51243 1727204721.80728: done getting variables 51243 1727204721.80844: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.056) 0:00:04.416 ***** 51243 1727204721.80886: entering _queue_task() for managed-node3/service 51243 1727204721.80889: Creating lock for service 51243 1727204721.81322: worker is 1 (out of 1 available) 51243 1727204721.81339: exiting _queue_task() for managed-node3/service 51243 1727204721.81353: done queuing things up, now waiting for results queue to drain 51243 1727204721.81355: waiting for pending results... 
51243 1727204721.81606: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 51243 1727204721.81768: in run() - task 127b8e07-fff9-5c5d-847b-000000000022 51243 1727204721.81793: variable 'ansible_search_path' from source: unknown 51243 1727204721.81803: variable 'ansible_search_path' from source: unknown 51243 1727204721.81850: calling self._execute() 51243 1727204721.81973: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.81979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.81985: variable 'omit' from source: magic vars 51243 1727204721.83074: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.83079: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.83172: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.83371: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.83375: when evaluation is False, skipping this task 51243 1727204721.83378: _execute() done 51243 1727204721.83380: dumping result to json 51243 1727204721.83383: done dumping result, returning 51243 1727204721.83386: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-000000000022] 51243 1727204721.83441: sending task result for task 127b8e07-fff9-5c5d-847b-000000000022 51243 1727204721.83526: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000022 51243 1727204721.83530: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204721.83599: no more pending results, returning what we have 51243 1727204721.83603: results queue empty 
51243 1727204721.83604: checking for any_errors_fatal 51243 1727204721.83614: done checking for any_errors_fatal 51243 1727204721.83615: checking for max_fail_percentage 51243 1727204721.83617: done checking for max_fail_percentage 51243 1727204721.83618: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.83619: done checking to see if all hosts have failed 51243 1727204721.83620: getting the remaining hosts for this loop 51243 1727204721.83621: done getting the remaining hosts for this loop 51243 1727204721.83626: getting the next task for host managed-node3 51243 1727204721.83636: done getting next task for host managed-node3 51243 1727204721.83641: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 51243 1727204721.83645: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.83661: getting variables 51243 1727204721.83663: in VariableManager get_vars() 51243 1727204721.83843: Calling all_inventory to load vars for managed-node3 51243 1727204721.83847: Calling groups_inventory to load vars for managed-node3 51243 1727204721.83849: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.83864: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.84003: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.84008: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.84335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.84600: done with get_vars() 51243 1727204721.84613: done getting variables 51243 1727204721.84701: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.038) 0:00:04.455 ***** 51243 1727204721.84741: entering _queue_task() for managed-node3/service 51243 1727204721.85589: worker is 1 (out of 1 available) 51243 1727204721.85606: exiting _queue_task() for managed-node3/service 51243 1727204721.85621: done queuing things up, now waiting for results queue to drain 51243 1727204721.85623: waiting for pending results... 
51243 1727204721.86587: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 51243 1727204721.87159: in run() - task 127b8e07-fff9-5c5d-847b-000000000023 51243 1727204721.87164: variable 'ansible_search_path' from source: unknown 51243 1727204721.87169: variable 'ansible_search_path' from source: unknown 51243 1727204721.87173: calling self._execute() 51243 1727204721.87562: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.87580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.87611: variable 'omit' from source: magic vars 51243 1727204721.88204: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.88224: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.88571: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.88577: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.88579: when evaluation is False, skipping this task 51243 1727204721.88583: _execute() done 51243 1727204721.88585: dumping result to json 51243 1727204721.88588: done dumping result, returning 51243 1727204721.88591: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-5c5d-847b-000000000023] 51243 1727204721.88593: sending task result for task 127b8e07-fff9-5c5d-847b-000000000023 51243 1727204721.88674: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000023 51243 1727204721.88678: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 51243 1727204721.88729: no more pending results, returning what we have 51243 1727204721.88732: results queue empty 51243 1727204721.88736: checking for any_errors_fatal 
51243 1727204721.88743: done checking for any_errors_fatal 51243 1727204721.88744: checking for max_fail_percentage 51243 1727204721.88746: done checking for max_fail_percentage 51243 1727204721.88747: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.88748: done checking to see if all hosts have failed 51243 1727204721.88749: getting the remaining hosts for this loop 51243 1727204721.88751: done getting the remaining hosts for this loop 51243 1727204721.88756: getting the next task for host managed-node3 51243 1727204721.88765: done getting next task for host managed-node3 51243 1727204721.88775: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 51243 1727204721.88779: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.88795: getting variables 51243 1727204721.88797: in VariableManager get_vars() 51243 1727204721.88856: Calling all_inventory to load vars for managed-node3 51243 1727204721.88859: Calling groups_inventory to load vars for managed-node3 51243 1727204721.88862: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.88978: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.88982: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.88986: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.89357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.89714: done with get_vars() 51243 1727204721.89727: done getting variables 51243 1727204721.89996: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.052) 0:00:04.508 ***** 51243 1727204721.90037: entering _queue_task() for managed-node3/service 51243 1727204721.90842: worker is 1 (out of 1 available) 51243 1727204721.90859: exiting _queue_task() for managed-node3/service 51243 1727204721.90875: done queuing things up, now waiting for results queue to drain 51243 1727204721.90970: waiting for pending results... 
51243 1727204721.91695: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 51243 1727204721.92182: in run() - task 127b8e07-fff9-5c5d-847b-000000000024 51243 1727204721.92187: variable 'ansible_search_path' from source: unknown 51243 1727204721.92191: variable 'ansible_search_path' from source: unknown 51243 1727204721.92378: calling self._execute() 51243 1727204721.92519: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.92538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.92556: variable 'omit' from source: magic vars 51243 1727204721.93540: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.93618: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.93794: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.93807: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.93823: when evaluation is False, skipping this task 51243 1727204721.93835: _execute() done 51243 1727204721.93847: dumping result to json 51243 1727204721.93855: done dumping result, returning 51243 1727204721.93871: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-5c5d-847b-000000000024] 51243 1727204721.93883: sending task result for task 127b8e07-fff9-5c5d-847b-000000000024 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204721.94065: no more pending results, returning what we have 51243 1727204721.94071: results queue empty 51243 1727204721.94072: checking for any_errors_fatal 51243 1727204721.94079: done checking for any_errors_fatal 51243 1727204721.94080: checking for max_fail_percentage 51243 1727204721.94081: 
done checking for max_fail_percentage 51243 1727204721.94082: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.94083: done checking to see if all hosts have failed 51243 1727204721.94083: getting the remaining hosts for this loop 51243 1727204721.94085: done getting the remaining hosts for this loop 51243 1727204721.94090: getting the next task for host managed-node3 51243 1727204721.94097: done getting next task for host managed-node3 51243 1727204721.94103: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 51243 1727204721.94106: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.94121: getting variables 51243 1727204721.94123: in VariableManager get_vars() 51243 1727204721.94356: Calling all_inventory to load vars for managed-node3 51243 1727204721.94367: Calling groups_inventory to load vars for managed-node3 51243 1727204721.94371: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.94379: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000024 51243 1727204721.94385: WORKER PROCESS EXITING 51243 1727204721.94396: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.94399: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.94403: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.94627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.95052: done with get_vars() 51243 1727204721.95068: done getting variables 51243 1727204721.95258: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.052) 0:00:04.560 ***** 51243 1727204721.95294: entering _queue_task() for managed-node3/service 51243 1727204721.96498: worker is 1 (out of 1 available) 51243 1727204721.96512: exiting _queue_task() for managed-node3/service 51243 1727204721.96642: done queuing things up, now waiting for results queue to drain 51243 1727204721.96645: waiting for pending results... 
51243 1727204721.96795: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 51243 1727204721.97240: in run() - task 127b8e07-fff9-5c5d-847b-000000000025 51243 1727204721.97244: variable 'ansible_search_path' from source: unknown 51243 1727204721.97248: variable 'ansible_search_path' from source: unknown 51243 1727204721.97362: calling self._execute() 51243 1727204721.97669: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204721.97676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204721.97679: variable 'omit' from source: magic vars 51243 1727204721.98124: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.98145: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204721.98280: variable 'ansible_distribution_major_version' from source: facts 51243 1727204721.98294: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204721.98302: when evaluation is False, skipping this task 51243 1727204721.98310: _execute() done 51243 1727204721.98317: dumping result to json 51243 1727204721.98330: done dumping result, returning 51243 1727204721.98345: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-5c5d-847b-000000000025] 51243 1727204721.98357: sending task result for task 127b8e07-fff9-5c5d-847b-000000000025 51243 1727204721.98593: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000025 51243 1727204721.98597: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 51243 1727204721.98652: no more pending results, returning what we have 51243 1727204721.98656: results queue empty 51243 1727204721.98658: checking for any_errors_fatal 51243 
1727204721.98669: done checking for any_errors_fatal 51243 1727204721.98670: checking for max_fail_percentage 51243 1727204721.98671: done checking for max_fail_percentage 51243 1727204721.98672: checking to see if all hosts have failed and the running result is not ok 51243 1727204721.98673: done checking to see if all hosts have failed 51243 1727204721.98674: getting the remaining hosts for this loop 51243 1727204721.98676: done getting the remaining hosts for this loop 51243 1727204721.98681: getting the next task for host managed-node3 51243 1727204721.98689: done getting next task for host managed-node3 51243 1727204721.98694: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 51243 1727204721.98697: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204721.98716: getting variables 51243 1727204721.98719: in VariableManager get_vars() 51243 1727204721.98884: Calling all_inventory to load vars for managed-node3 51243 1727204721.98887: Calling groups_inventory to load vars for managed-node3 51243 1727204721.98890: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204721.98903: Calling all_plugins_play to load vars for managed-node3 51243 1727204721.98906: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204721.98909: Calling groups_plugins_play to load vars for managed-node3 51243 1727204721.99342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204721.99880: done with get_vars() 51243 1727204721.99896: done getting variables 51243 1727204722.00104: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:05:21 -0400 (0:00:00.048) 0:00:04.609 ***** 51243 1727204722.00146: entering _queue_task() for managed-node3/copy 51243 1727204722.00711: worker is 1 (out of 1 available) 51243 1727204722.00726: exiting _queue_task() for managed-node3/copy 51243 1727204722.00873: done queuing things up, now waiting for results queue to drain 51243 1727204722.00876: waiting for pending results... 
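Two of the skip results above ("Enable and start NetworkManager", "Enable network service") show only a `censored` key instead of the usual `false_condition`/`skip_reason` pair. That is `no_log: true` at work: Ansible redacts the result payload even when the task is merely skipped. An illustrative fragment (not the role's actual source) showing the keyword:

```yaml
# Illustrative only: with no_log set, even a skipped task's result is
# replaced by the "censored" placeholder seen in the log above.
- name: Enable and start NetworkManager
  service:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true
  when: ansible_distribution_major_version == '7'
```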
51243 1727204722.01612: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 51243 1727204722.01695: in run() - task 127b8e07-fff9-5c5d-847b-000000000026 51243 1727204722.01721: variable 'ansible_search_path' from source: unknown 51243 1727204722.01879: variable 'ansible_search_path' from source: unknown 51243 1727204722.01883: calling self._execute() 51243 1727204722.02076: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.02090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.02106: variable 'omit' from source: magic vars 51243 1727204722.03017: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.03042: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.03205: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.03221: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.03229: when evaluation is False, skipping this task 51243 1727204722.03240: _execute() done 51243 1727204722.03247: dumping result to json 51243 1727204722.03255: done dumping result, returning 51243 1727204722.03274: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-5c5d-847b-000000000026] 51243 1727204722.03285: sending task result for task 127b8e07-fff9-5c5d-847b-000000000026 51243 1727204722.03471: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000026 51243 1727204722.03474: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204722.03544: no more pending results, returning what we have 51243 1727204722.03548: results queue empty 51243 
1727204722.03550: checking for any_errors_fatal 51243 1727204722.03556: done checking for any_errors_fatal 51243 1727204722.03557: checking for max_fail_percentage 51243 1727204722.03559: done checking for max_fail_percentage 51243 1727204722.03560: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.03561: done checking to see if all hosts have failed 51243 1727204722.03562: getting the remaining hosts for this loop 51243 1727204722.03564: done getting the remaining hosts for this loop 51243 1727204722.03571: getting the next task for host managed-node3 51243 1727204722.03578: done getting next task for host managed-node3 51243 1727204722.03583: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 51243 1727204722.03587: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.03605: getting variables 51243 1727204722.03607: in VariableManager get_vars() 51243 1727204722.03784: Calling all_inventory to load vars for managed-node3 51243 1727204722.03787: Calling groups_inventory to load vars for managed-node3 51243 1727204722.03791: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.03804: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.03807: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.03810: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.04324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.04594: done with get_vars() 51243 1727204722.04608: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.045) 0:00:04.654 ***** 51243 1727204722.04710: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 51243 1727204722.04712: Creating lock for fedora.linux_system_roles.network_connections 51243 1727204722.05335: worker is 1 (out of 1 available) 51243 1727204722.05351: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 51243 1727204722.05364: done queuing things up, now waiting for results queue to drain 51243 1727204722.05515: waiting for pending results... 
51243 1727204722.05950: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 51243 1727204722.06205: in run() - task 127b8e07-fff9-5c5d-847b-000000000027 51243 1727204722.06226: variable 'ansible_search_path' from source: unknown 51243 1727204722.06389: variable 'ansible_search_path' from source: unknown 51243 1727204722.06431: calling self._execute() 51243 1727204722.06774: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.06780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.06782: variable 'omit' from source: magic vars 51243 1727204722.07495: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.07557: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.07701: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.07714: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.07721: when evaluation is False, skipping this task 51243 1727204722.07728: _execute() done 51243 1727204722.07734: dumping result to json 51243 1727204722.07740: done dumping result, returning 51243 1727204722.07758: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-5c5d-847b-000000000027] 51243 1727204722.07771: sending task result for task 127b8e07-fff9-5c5d-847b-000000000027 51243 1727204722.07912: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000027 51243 1727204722.07919: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204722.08121: no more pending results, returning what we have 51243 1727204722.08124: results queue empty 51243 1727204722.08125: checking 
for any_errors_fatal 51243 1727204722.08130: done checking for any_errors_fatal 51243 1727204722.08131: checking for max_fail_percentage 51243 1727204722.08135: done checking for max_fail_percentage 51243 1727204722.08136: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.08137: done checking to see if all hosts have failed 51243 1727204722.08137: getting the remaining hosts for this loop 51243 1727204722.08139: done getting the remaining hosts for this loop 51243 1727204722.08142: getting the next task for host managed-node3 51243 1727204722.08148: done getting next task for host managed-node3 51243 1727204722.08152: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 51243 1727204722.08154: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.08171: getting variables 51243 1727204722.08173: in VariableManager get_vars() 51243 1727204722.08219: Calling all_inventory to load vars for managed-node3 51243 1727204722.08222: Calling groups_inventory to load vars for managed-node3 51243 1727204722.08224: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.08248: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.08251: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.08255: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.08549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.08823: done with get_vars() 51243 1727204722.08840: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.042) 0:00:04.697 ***** 51243 1727204722.08947: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 51243 1727204722.08949: Creating lock for fedora.linux_system_roles.network_state 51243 1727204722.09385: worker is 1 (out of 1 available) 51243 1727204722.09399: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 51243 1727204722.09413: done queuing things up, now waiting for results queue to drain 51243 1727204722.09414: waiting for pending results... 
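Unlike the earlier `service`, `copy`, and `debug` tasks, the two "Configure networking ..." tasks dispatch to the collection's own modules — note the "Creating lock for fedora.linux_system_roles.network_connections" and "... network_state" entries. Both are driven by role variables of the same names. A hedged sketch of how a playbook might feed them; the values are illustrative, and the real test playbook's input is not visible in this log:

```yaml
# Illustrative variables for the collection's custom modules; the actual
# connection profiles used by this test run are not shown in the log.
- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: eth0
        type: ethernet
        ip:
          dhcp4: true
    # network_state is the alternative, declarative (Nmstate-style) input;
    # a playbook normally supplies one of the two, not both.
```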
51243 1727204722.09685: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 51243 1727204722.09877: in run() - task 127b8e07-fff9-5c5d-847b-000000000028 51243 1727204722.09881: variable 'ansible_search_path' from source: unknown 51243 1727204722.09884: variable 'ansible_search_path' from source: unknown 51243 1727204722.09971: calling self._execute() 51243 1727204722.10030: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.10047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.10063: variable 'omit' from source: magic vars 51243 1727204722.10689: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.10711: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.10881: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.10959: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.10964: when evaluation is False, skipping this task 51243 1727204722.10967: _execute() done 51243 1727204722.10970: dumping result to json 51243 1727204722.10972: done dumping result, returning 51243 1727204722.10977: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-5c5d-847b-000000000028] 51243 1727204722.10980: sending task result for task 127b8e07-fff9-5c5d-847b-000000000028 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204722.11136: no more pending results, returning what we have 51243 1727204722.11140: results queue empty 51243 1727204722.11142: checking for any_errors_fatal 51243 1727204722.11151: done checking for any_errors_fatal 51243 1727204722.11152: checking for max_fail_percentage 51243 1727204722.11154: done 
checking for max_fail_percentage 51243 1727204722.11155: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.11155: done checking to see if all hosts have failed 51243 1727204722.11156: getting the remaining hosts for this loop 51243 1727204722.11158: done getting the remaining hosts for this loop 51243 1727204722.11164: getting the next task for host managed-node3 51243 1727204722.11174: done getting next task for host managed-node3 51243 1727204722.11179: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 51243 1727204722.11182: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.11203: getting variables 51243 1727204722.11205: in VariableManager get_vars() 51243 1727204722.11489: Calling all_inventory to load vars for managed-node3 51243 1727204722.11493: Calling groups_inventory to load vars for managed-node3 51243 1727204722.11496: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.11504: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000028 51243 1727204722.11507: WORKER PROCESS EXITING 51243 1727204722.11520: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.11523: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.11527: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.11923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.12185: done with get_vars() 51243 1727204722.12198: done getting variables 51243 1727204722.12278: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.033) 0:00:04.730 ***** 51243 1727204722.12314: entering _queue_task() for managed-node3/debug 51243 1727204722.12905: worker is 1 (out of 1 available) 51243 1727204722.12918: exiting _queue_task() for managed-node3/debug 51243 1727204722.12930: done queuing things up, now waiting for results queue to drain 51243 1727204722.12931: waiting for pending results... 
51243 1727204722.13064: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 51243 1727204722.13251: in run() - task 127b8e07-fff9-5c5d-847b-000000000029 51243 1727204722.13284: variable 'ansible_search_path' from source: unknown 51243 1727204722.13298: variable 'ansible_search_path' from source: unknown 51243 1727204722.13374: calling self._execute() 51243 1727204722.13454: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.13470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.13511: variable 'omit' from source: magic vars 51243 1727204722.13943: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.13971: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.14137: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.14140: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.14143: when evaluation is False, skipping this task 51243 1727204722.14145: _execute() done 51243 1727204722.14148: dumping result to json 51243 1727204722.14151: done dumping result, returning 51243 1727204722.14176: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-5c5d-847b-000000000029] 51243 1727204722.14179: sending task result for task 127b8e07-fff9-5c5d-847b-000000000029 51243 1727204722.14321: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000029 51243 1727204722.14325: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204722.14406: no more pending results, returning what we have 51243 1727204722.14411: results queue empty 51243 1727204722.14412: checking for any_errors_fatal 51243 1727204722.14420: done 
checking for any_errors_fatal 51243 1727204722.14421: checking for max_fail_percentage 51243 1727204722.14423: done checking for max_fail_percentage 51243 1727204722.14424: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.14424: done checking to see if all hosts have failed 51243 1727204722.14425: getting the remaining hosts for this loop 51243 1727204722.14427: done getting the remaining hosts for this loop 51243 1727204722.14432: getting the next task for host managed-node3 51243 1727204722.14443: done getting next task for host managed-node3 51243 1727204722.14448: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 51243 1727204722.14452: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.14585: getting variables 51243 1727204722.14588: in VariableManager get_vars() 51243 1727204722.14652: Calling all_inventory to load vars for managed-node3 51243 1727204722.14655: Calling groups_inventory to load vars for managed-node3 51243 1727204722.14658: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.14804: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.14809: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.14813: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.15151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.15475: done with get_vars() 51243 1727204722.15488: done getting variables 51243 1727204722.15558: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.032) 0:00:04.763 ***** 51243 1727204722.15595: entering _queue_task() for managed-node3/debug 51243 1727204722.15952: worker is 1 (out of 1 available) 51243 1727204722.16072: exiting _queue_task() for managed-node3/debug 51243 1727204722.16088: done queuing things up, now waiting for results queue to drain 51243 1727204722.16090: waiting for pending results... 
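The last two tasks in this stretch are plain `debug` tasks; their skip results carry `false_condition` but no `changed` key, since `debug` never reports a change. Their shape in the role is likely close to the following sketch; the registered variable name is an assumption, not taken from the role's source:

```yaml
# Hedged sketch of the "Show ... messages" debug tasks; the registered
# result variable below is hypothetical.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr
  when: ansible_distribution_major_version == '7'  # inherited guard; False here, so it skips
```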
51243 1727204722.16435: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 51243 1727204722.16482: in run() - task 127b8e07-fff9-5c5d-847b-00000000002a 51243 1727204722.16511: variable 'ansible_search_path' from source: unknown 51243 1727204722.16599: variable 'ansible_search_path' from source: unknown 51243 1727204722.16603: calling self._execute() 51243 1727204722.16693: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.16707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.16727: variable 'omit' from source: magic vars 51243 1727204722.17185: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.17206: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.17349: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.17388: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.17395: when evaluation is False, skipping this task 51243 1727204722.17399: _execute() done 51243 1727204722.17402: dumping result to json 51243 1727204722.17405: done dumping result, returning 51243 1727204722.17411: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-5c5d-847b-00000000002a] 51243 1727204722.17470: sending task result for task 127b8e07-fff9-5c5d-847b-00000000002a skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204722.17663: no more pending results, returning what we have 51243 1727204722.17669: results queue empty 51243 1727204722.17671: checking for any_errors_fatal 51243 1727204722.17677: done checking for any_errors_fatal 51243 1727204722.17678: checking for max_fail_percentage 51243 1727204722.17680: done checking for 
max_fail_percentage 51243 1727204722.17681: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.17682: done checking to see if all hosts have failed 51243 1727204722.17683: getting the remaining hosts for this loop 51243 1727204722.17685: done getting the remaining hosts for this loop 51243 1727204722.17690: getting the next task for host managed-node3 51243 1727204722.17699: done getting next task for host managed-node3 51243 1727204722.17703: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 51243 1727204722.17707: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.17843: getting variables 51243 1727204722.17846: in VariableManager get_vars() 51243 1727204722.17904: Calling all_inventory to load vars for managed-node3 51243 1727204722.17907: Calling groups_inventory to load vars for managed-node3 51243 1727204722.17910: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.17923: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.17926: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.17931: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.18059: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000002a 51243 1727204722.18062: WORKER PROCESS EXITING 51243 1727204722.18308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.18564: done with get_vars() 51243 1727204722.18579: done getting variables 51243 1727204722.18650: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.030) 0:00:04.794 ***** 51243 1727204722.18690: entering _queue_task() for managed-node3/debug 51243 1727204722.19079: worker is 1 (out of 1 available) 51243 1727204722.19093: exiting _queue_task() for managed-node3/debug 51243 1727204722.19106: done queuing things up, now waiting for results queue to drain 51243 1727204722.19108: waiting for pending results... 
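Every skipped task above follows the same pattern: the task's `when:` conditionals are evaluated against gathered facts, `ansible_distribution_major_version != '6'` passes, `ansible_distribution_major_version == '7'` fails, and the task is skipped with the failing expression reported as `false_condition`. A minimal stdlib sketch (log line format inferred from the entries above; function name is illustrative) for pulling these evaluations out of a saved `-vvvv` log:

```python
import re

# Matches verbose-log lines like:
#   51243 1727204722.17206: Evaluated conditional (ansible_distribution_major_version != '6'): True
COND_RE = re.compile(r"Evaluated conditional \((?P<expr>.+?)\): (?P<result>True|False)")

def conditional_evaluations(log_text: str):
    """Return (expression, passed) pairs in the order they appear in the log."""
    return [(m.group("expr"), m.group("result") == "True")
            for m in COND_RE.finditer(log_text)]

sample = (
    "51243 1727204722.17206: Evaluated conditional "
    "(ansible_distribution_major_version != '6'): True "
    "51243 1727204722.17388: Evaluated conditional "
    "(ansible_distribution_major_version == '7'): False"
)
print(conditional_evaluations(sample))
# [("ansible_distribution_major_version != '6'", True),
#  ("ansible_distribution_major_version == '7'", False)]
```

Grouping these pairs by task makes it easy to see which single condition gated an entire run, as it does for every task in this section.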
51243 1727204722.19378: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 51243 1727204722.19527: in run() - task 127b8e07-fff9-5c5d-847b-00000000002b 51243 1727204722.19554: variable 'ansible_search_path' from source: unknown 51243 1727204722.19566: variable 'ansible_search_path' from source: unknown 51243 1727204722.19626: calling self._execute() 51243 1727204722.19726: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.19743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.19759: variable 'omit' from source: magic vars 51243 1727204722.20224: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.20247: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.20391: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.20404: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.20431: when evaluation is False, skipping this task 51243 1727204722.20438: _execute() done 51243 1727204722.20441: dumping result to json 51243 1727204722.20443: done dumping result, returning 51243 1727204722.20542: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-5c5d-847b-00000000002b] 51243 1727204722.20546: sending task result for task 127b8e07-fff9-5c5d-847b-00000000002b 51243 1727204722.20627: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000002b 51243 1727204722.20631: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204722.20693: no more pending results, returning what we have 51243 1727204722.20697: results queue empty 51243 1727204722.20698: checking for any_errors_fatal 51243 1727204722.20705: done checking for 
any_errors_fatal 51243 1727204722.20706: checking for max_fail_percentage 51243 1727204722.20707: done checking for max_fail_percentage 51243 1727204722.20708: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.20709: done checking to see if all hosts have failed 51243 1727204722.20710: getting the remaining hosts for this loop 51243 1727204722.20712: done getting the remaining hosts for this loop 51243 1727204722.20717: getting the next task for host managed-node3 51243 1727204722.20725: done getting next task for host managed-node3 51243 1727204722.20730: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 51243 1727204722.20737: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.20754: getting variables 51243 1727204722.20756: in VariableManager get_vars() 51243 1727204722.20818: Calling all_inventory to load vars for managed-node3 51243 1727204722.20822: Calling groups_inventory to load vars for managed-node3 51243 1727204722.20825: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.20842: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.20846: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.20849: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.21654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.21911: done with get_vars() 51243 1727204722.21923: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.033) 0:00:04.828 ***** 51243 1727204722.22041: entering _queue_task() for managed-node3/ping 51243 1727204722.22043: Creating lock for ping 51243 1727204722.22513: worker is 1 (out of 1 available) 51243 1727204722.22526: exiting _queue_task() for managed-node3/ping 51243 1727204722.22540: done queuing things up, now waiting for results queue to drain 51243 1727204722.22542: waiting for pending results... 
51243 1727204722.22847: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 51243 1727204722.22941: in run() - task 127b8e07-fff9-5c5d-847b-00000000002c 51243 1727204722.23046: variable 'ansible_search_path' from source: unknown 51243 1727204722.23050: variable 'ansible_search_path' from source: unknown 51243 1727204722.23053: calling self._execute() 51243 1727204722.23098: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.23112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.23128: variable 'omit' from source: magic vars 51243 1727204722.23575: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.23602: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.23739: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.23752: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.23760: when evaluation is False, skipping this task 51243 1727204722.23770: _execute() done 51243 1727204722.23778: dumping result to json 51243 1727204722.23786: done dumping result, returning 51243 1727204722.23798: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-5c5d-847b-00000000002c] 51243 1727204722.23815: sending task result for task 127b8e07-fff9-5c5d-847b-00000000002c skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204722.24086: no more pending results, returning what we have 51243 1727204722.24091: results queue empty 51243 1727204722.24092: checking for any_errors_fatal 51243 1727204722.24101: done checking for any_errors_fatal 51243 1727204722.24102: checking for max_fail_percentage 51243 1727204722.24104: done checking for 
max_fail_percentage 51243 1727204722.24104: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.24106: done checking to see if all hosts have failed 51243 1727204722.24107: getting the remaining hosts for this loop 51243 1727204722.24109: done getting the remaining hosts for this loop 51243 1727204722.24114: getting the next task for host managed-node3 51243 1727204722.24124: done getting next task for host managed-node3 51243 1727204722.24128: ^ task is: TASK: meta (role_complete) 51243 1727204722.24139: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.24158: getting variables 51243 1727204722.24160: in VariableManager get_vars() 51243 1727204722.24222: Calling all_inventory to load vars for managed-node3 51243 1727204722.24225: Calling groups_inventory to load vars for managed-node3 51243 1727204722.24228: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.24362: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.24368: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.24374: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000002c 51243 1727204722.24376: WORKER PROCESS EXITING 51243 1727204722.24381: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.24702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.24978: done with get_vars() 51243 1727204722.24996: done getting variables 51243 1727204722.25084: done queuing things up, now waiting for results queue to drain 51243 1727204722.25086: results queue empty 51243 1727204722.25087: checking for any_errors_fatal 51243 1727204722.25090: done checking for any_errors_fatal 51243 1727204722.25091: checking for max_fail_percentage 51243 1727204722.25092: done checking for max_fail_percentage 51243 1727204722.25093: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.25094: done checking to see if all hosts have failed 51243 1727204722.25095: getting the remaining hosts for this loop 51243 1727204722.25096: done getting the remaining hosts for this loop 51243 1727204722.25098: getting the next task for host managed-node3 51243 1727204722.25108: done getting next task for host managed-node3 51243 1727204722.25111: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 51243 1727204722.25114: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204722.25124: getting variables 51243 1727204722.25125: in VariableManager get_vars() 51243 1727204722.25150: Calling all_inventory to load vars for managed-node3 51243 1727204722.25153: Calling groups_inventory to load vars for managed-node3 51243 1727204722.25155: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.25160: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.25162: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.25167: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.25371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.25623: done with get_vars() 51243 1727204722.25636: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.036) 0:00:04.865 ***** 51243 1727204722.25725: entering _queue_task() for managed-node3/include_tasks 51243 1727204722.26231: worker is 1 (out of 1 available) 51243 1727204722.26247: exiting _queue_task() for managed-node3/include_tasks 51243 1727204722.26260: done queuing things up, now waiting for results queue to drain 51243 1727204722.26262: waiting for pending results... 
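The trailer on each `TASK [...]` banner, e.g. `(0:00:00.036) 0:00:04.865`, comes from a task-profiling callback: the first duration is the elapsed time of the previous task, the second is cumulative playbook runtime. A small stdlib sketch (format inferred from this log; names are illustrative) that converts both to seconds:

```python
import re

# e.g. "(0:00:00.036)  0:00:04.865" -> previous-task elapsed, cumulative runtime
TIMING_RE = re.compile(r"\((\d+):(\d+):(\d+\.\d+)\)\s+(\d+):(\d+):(\d+\.\d+)")

def parse_timing(line: str):
    """Return (task_seconds, total_seconds) from a task-banner timing trailer,
    or None if the line carries no timing information."""
    m = TIMING_RE.search(line)
    if m is None:
        return None
    h1, m1, s1, h2, m2, s2 = m.groups()
    to_sec = lambda h, mi, s: int(h) * 3600 + int(mi) * 60 + float(s)
    return to_sec(h1, m1, s1), to_sec(h2, m2, s2)

print(parse_timing(
    "Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.036) 0:00:04.865 *****"
))
# (0.036, 4.865)
```

Summing the first field across all banners should approximately reproduce the final cumulative figure, which is a quick sanity check when hunting for slow tasks.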
51243 1727204722.26480: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 51243 1727204722.26583: in run() - task 127b8e07-fff9-5c5d-847b-000000000063 51243 1727204722.26596: variable 'ansible_search_path' from source: unknown 51243 1727204722.26604: variable 'ansible_search_path' from source: unknown 51243 1727204722.26642: calling self._execute() 51243 1727204722.26711: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.26720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.26725: variable 'omit' from source: magic vars 51243 1727204722.27042: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.27054: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.27141: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.27145: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.27148: when evaluation is False, skipping this task 51243 1727204722.27150: _execute() done 51243 1727204722.27155: dumping result to json 51243 1727204722.27157: done dumping result, returning 51243 1727204722.27167: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-5c5d-847b-000000000063] 51243 1727204722.27176: sending task result for task 127b8e07-fff9-5c5d-847b-000000000063 51243 1727204722.27270: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000063 51243 1727204722.27272: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204722.27325: no more pending results, returning what we have 51243 1727204722.27328: results queue empty 51243 1727204722.27329: checking for 
any_errors_fatal 51243 1727204722.27331: done checking for any_errors_fatal 51243 1727204722.27331: checking for max_fail_percentage 51243 1727204722.27334: done checking for max_fail_percentage 51243 1727204722.27335: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.27336: done checking to see if all hosts have failed 51243 1727204722.27336: getting the remaining hosts for this loop 51243 1727204722.27339: done getting the remaining hosts for this loop 51243 1727204722.27343: getting the next task for host managed-node3 51243 1727204722.27351: done getting next task for host managed-node3 51243 1727204722.27355: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 51243 1727204722.27358: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.27379: getting variables 51243 1727204722.27381: in VariableManager get_vars() 51243 1727204722.27426: Calling all_inventory to load vars for managed-node3 51243 1727204722.27428: Calling groups_inventory to load vars for managed-node3 51243 1727204722.27430: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.27441: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.27443: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.27446: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.27632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.27902: done with get_vars() 51243 1727204722.27927: done getting variables 51243 1727204722.27998: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.023) 0:00:04.888 ***** 51243 1727204722.28048: entering _queue_task() for managed-node3/debug 51243 1727204722.28740: worker is 1 (out of 1 available) 51243 1727204722.28754: exiting _queue_task() for managed-node3/debug 51243 1727204722.28819: done queuing things up, now waiting for results queue to drain 51243 1727204722.28821: waiting for pending results... 
51243 1727204722.29027: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 51243 1727204722.29186: in run() - task 127b8e07-fff9-5c5d-847b-000000000064 51243 1727204722.29209: variable 'ansible_search_path' from source: unknown 51243 1727204722.29219: variable 'ansible_search_path' from source: unknown 51243 1727204722.29283: calling self._execute() 51243 1727204722.29395: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.29407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.29423: variable 'omit' from source: magic vars 51243 1727204722.29897: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.29994: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.30055: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.30069: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.30076: when evaluation is False, skipping this task 51243 1727204722.30084: _execute() done 51243 1727204722.30091: dumping result to json 51243 1727204722.30107: done dumping result, returning 51243 1727204722.30128: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-5c5d-847b-000000000064] 51243 1727204722.30171: sending task result for task 127b8e07-fff9-5c5d-847b-000000000064 51243 1727204722.30399: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000064 51243 1727204722.30403: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204722.30468: no more pending results, returning what we have 51243 1727204722.30472: results queue empty 51243 1727204722.30473: checking for any_errors_fatal 51243 1727204722.30479: done checking for any_errors_fatal 51243 1727204722.30480: 
checking for max_fail_percentage 51243 1727204722.30482: done checking for max_fail_percentage 51243 1727204722.30482: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.30483: done checking to see if all hosts have failed 51243 1727204722.30484: getting the remaining hosts for this loop 51243 1727204722.30487: done getting the remaining hosts for this loop 51243 1727204722.30491: getting the next task for host managed-node3 51243 1727204722.30500: done getting next task for host managed-node3 51243 1727204722.30504: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 51243 1727204722.30508: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.30530: getting variables 51243 1727204722.30532: in VariableManager get_vars() 51243 1727204722.30718: Calling all_inventory to load vars for managed-node3 51243 1727204722.30721: Calling groups_inventory to load vars for managed-node3 51243 1727204722.30723: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.30737: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.30740: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.30744: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.31163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.31402: done with get_vars() 51243 1727204722.31414: done getting variables 51243 1727204722.31492: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.034) 0:00:04.923 ***** 51243 1727204722.31529: entering _queue_task() for managed-node3/fail 51243 1727204722.32022: worker is 1 (out of 1 available) 51243 1727204722.32036: exiting _queue_task() for managed-node3/fail 51243 1727204722.32049: done queuing things up, now waiting for results queue to drain 51243 1727204722.32050: waiting for pending results... 
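The `^ state is: HOST STATE: ...` dumps track the strategy's position in the play for this host: in this log the inner `task=` counter increments by one per task (20, 21, 22, ...), and the nested `tasks child state? (...)` parenthetical is the same structure recursing into an included block. A stdlib sketch (field names taken verbatim from the dumps above; the index interpretation is an assumption from how the counters advance here) that extracts the numeric counters of the outermost state:

```python
import re

def host_state_counters(dump: str) -> dict:
    """Extract top-level numeric key=value fields from a HOST STATE dump,
    ignoring everything inside the nested child-state parentheticals."""
    head = dump.split("tasks child state?")[0]
    return {k: int(v) for k, v in re.findall(r"(\w+)=(\d+)", head)}

state = ("HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, "
         "run_state=1, fail_state=0, pre_flushing_run_state=1, "
         "update_handlers=True, pending_setup=False, tasks child state? (...)")
print(host_state_counters(state))
# {'block': 3, 'task': 4, 'rescue': 0, 'always': 0, 'handlers': 0,
#  'run_state': 1, 'fail_state': 0, 'pre_flushing_run_state': 1}
```

Boolean fields such as `update_handlers=True` are deliberately left out by the `\d+` pattern; only the counters useful for diffing successive dumps are kept.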
51243 1727204722.32344: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 51243 1727204722.32473: in run() - task 127b8e07-fff9-5c5d-847b-000000000065 51243 1727204722.32477: variable 'ansible_search_path' from source: unknown 51243 1727204722.32494: variable 'ansible_search_path' from source: unknown 51243 1727204722.32573: calling self._execute() 51243 1727204722.32696: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.32726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.32826: variable 'omit' from source: magic vars 51243 1727204722.33511: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.33516: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.33849: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.33921: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.33930: when evaluation is False, skipping this task 51243 1727204722.33937: _execute() done 51243 1727204722.33991: dumping result to json 51243 1727204722.33995: done dumping result, returning 51243 1727204722.33998: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-5c5d-847b-000000000065] 51243 1727204722.34001: sending task result for task 127b8e07-fff9-5c5d-847b-000000000065 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204722.34408: no more pending results, returning what we have 51243 1727204722.34411: results queue empty 51243 1727204722.34412: 
checking for any_errors_fatal 51243 1727204722.34418: done checking for any_errors_fatal 51243 1727204722.34419: checking for max_fail_percentage 51243 1727204722.34421: done checking for max_fail_percentage 51243 1727204722.34421: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.34422: done checking to see if all hosts have failed 51243 1727204722.34423: getting the remaining hosts for this loop 51243 1727204722.34425: done getting the remaining hosts for this loop 51243 1727204722.34429: getting the next task for host managed-node3 51243 1727204722.34436: done getting next task for host managed-node3 51243 1727204722.34440: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 51243 1727204722.34443: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.34511: getting variables 51243 1727204722.34513: in VariableManager get_vars() 51243 1727204722.34561: Calling all_inventory to load vars for managed-node3 51243 1727204722.34564: Calling groups_inventory to load vars for managed-node3 51243 1727204722.34589: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.34603: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.34606: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.34610: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.34854: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000065 51243 1727204722.34860: WORKER PROCESS EXITING 51243 1727204722.34881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.35037: done with get_vars() 51243 1727204722.35048: done getting variables 51243 1727204722.35111: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.036) 0:00:04.959 ***** 51243 1727204722.35145: entering _queue_task() for managed-node3/fail 51243 1727204722.35409: worker is 1 (out of 1 available) 51243 1727204722.35423: exiting _queue_task() for managed-node3/fail 51243 1727204722.35438: done queuing things up, now waiting for results queue to drain 51243 1727204722.35440: waiting for pending results... 
51243 1727204722.35631: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 51243 1727204722.35731: in run() - task 127b8e07-fff9-5c5d-847b-000000000066 51243 1727204722.35746: variable 'ansible_search_path' from source: unknown 51243 1727204722.35750: variable 'ansible_search_path' from source: unknown 51243 1727204722.35789: calling self._execute() 51243 1727204722.35861: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.35868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.35879: variable 'omit' from source: magic vars 51243 1727204722.36277: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.36310: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.36425: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.36430: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.36433: when evaluation is False, skipping this task 51243 1727204722.36439: _execute() done 51243 1727204722.36442: dumping result to json 51243 1727204722.36446: done dumping result, returning 51243 1727204722.36464: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-5c5d-847b-000000000066] 51243 1727204722.36468: sending task result for task 127b8e07-fff9-5c5d-847b-000000000066 51243 1727204722.36871: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000066 51243 1727204722.36875: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204722.36948: no more 
pending results, returning what we have 51243 1727204722.36952: results queue empty 51243 1727204722.36953: checking for any_errors_fatal 51243 1727204722.36957: done checking for any_errors_fatal 51243 1727204722.36958: checking for max_fail_percentage 51243 1727204722.36960: done checking for max_fail_percentage 51243 1727204722.36960: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.36961: done checking to see if all hosts have failed 51243 1727204722.36962: getting the remaining hosts for this loop 51243 1727204722.36964: done getting the remaining hosts for this loop 51243 1727204722.36970: getting the next task for host managed-node3 51243 1727204722.36976: done getting next task for host managed-node3 51243 1727204722.36980: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 51243 1727204722.36983: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.37004: getting variables 51243 1727204722.37005: in VariableManager get_vars() 51243 1727204722.37055: Calling all_inventory to load vars for managed-node3 51243 1727204722.37058: Calling groups_inventory to load vars for managed-node3 51243 1727204722.37061: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.37174: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.37178: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.37182: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.37526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.37796: done with get_vars() 51243 1727204722.37808: done getting variables 51243 1727204722.37900: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.027) 0:00:04.987 ***** 51243 1727204722.37939: entering _queue_task() for managed-node3/fail 51243 1727204722.38565: worker is 1 (out of 1 available) 51243 1727204722.38579: exiting _queue_task() for managed-node3/fail 51243 1727204722.38591: done queuing things up, now waiting for results queue to drain 51243 1727204722.38593: waiting for pending results... 
51243 1727204722.38931: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 51243 1727204722.39094: in run() - task 127b8e07-fff9-5c5d-847b-000000000067 51243 1727204722.39118: variable 'ansible_search_path' from source: unknown 51243 1727204722.39126: variable 'ansible_search_path' from source: unknown 51243 1727204722.39183: calling self._execute() 51243 1727204722.39301: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.39314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.39329: variable 'omit' from source: magic vars 51243 1727204722.39880: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.39889: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.39980: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.39984: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.39987: when evaluation is False, skipping this task 51243 1727204722.39991: _execute() done 51243 1727204722.39994: dumping result to json 51243 1727204722.39996: done dumping result, returning 51243 1727204722.40005: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-5c5d-847b-000000000067] 51243 1727204722.40011: sending task result for task 127b8e07-fff9-5c5d-847b-000000000067 51243 1727204722.40120: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000067 51243 1727204722.40123: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204722.40182: no more pending 
results, returning what we have 51243 1727204722.40185: results queue empty 51243 1727204722.40186: checking for any_errors_fatal 51243 1727204722.40195: done checking for any_errors_fatal 51243 1727204722.40195: checking for max_fail_percentage 51243 1727204722.40197: done checking for max_fail_percentage 51243 1727204722.40198: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.40199: done checking to see if all hosts have failed 51243 1727204722.40200: getting the remaining hosts for this loop 51243 1727204722.40202: done getting the remaining hosts for this loop 51243 1727204722.40209: getting the next task for host managed-node3 51243 1727204722.40216: done getting next task for host managed-node3 51243 1727204722.40220: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 51243 1727204722.40223: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.40240: getting variables 51243 1727204722.40242: in VariableManager get_vars() 51243 1727204722.40289: Calling all_inventory to load vars for managed-node3 51243 1727204722.40292: Calling groups_inventory to load vars for managed-node3 51243 1727204722.40294: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.40304: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.40306: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.40308: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.40462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.40621: done with get_vars() 51243 1727204722.40631: done getting variables 51243 1727204722.40685: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.027) 0:00:05.014 ***** 51243 1727204722.40711: entering _queue_task() for managed-node3/dnf 51243 1727204722.40955: worker is 1 (out of 1 available) 51243 1727204722.40970: exiting _queue_task() for managed-node3/dnf 51243 1727204722.40983: done queuing things up, now waiting for results queue to drain 51243 1727204722.40985: waiting for pending results... 
51243 1727204722.41170: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 51243 1727204722.41270: in run() - task 127b8e07-fff9-5c5d-847b-000000000068 51243 1727204722.41282: variable 'ansible_search_path' from source: unknown 51243 1727204722.41286: variable 'ansible_search_path' from source: unknown 51243 1727204722.41346: calling self._execute() 51243 1727204722.41671: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.41676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.41680: variable 'omit' from source: magic vars 51243 1727204722.41852: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.41874: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.42003: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.42246: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.42256: when evaluation is False, skipping this task 51243 1727204722.42264: _execute() done 51243 1727204722.42676: dumping result to json 51243 1727204722.42679: done dumping result, returning 51243 1727204722.42683: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-000000000068] 51243 1727204722.42686: sending task result for task 127b8e07-fff9-5c5d-847b-000000000068 51243 1727204722.42781: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000068 51243 1727204722.42785: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 51243 1727204722.42848: no more pending results, returning what we have 51243 1727204722.42852: results queue empty 51243 1727204722.42853: checking for any_errors_fatal 51243 1727204722.42860: done checking for any_errors_fatal 51243 1727204722.42861: checking for max_fail_percentage 51243 1727204722.42863: done checking for max_fail_percentage 51243 1727204722.42864: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.42865: done checking to see if all hosts have failed 51243 1727204722.42970: getting the remaining hosts for this loop 51243 1727204722.42972: done getting the remaining hosts for this loop 51243 1727204722.42979: getting the next task for host managed-node3 51243 1727204722.42986: done getting next task for host managed-node3 51243 1727204722.42991: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 51243 1727204722.42994: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.43011: getting variables 51243 1727204722.43013: in VariableManager get_vars() 51243 1727204722.43060: Calling all_inventory to load vars for managed-node3 51243 1727204722.43063: Calling groups_inventory to load vars for managed-node3 51243 1727204722.43171: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.43189: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.43192: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.43195: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.43656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.44019: done with get_vars() 51243 1727204722.44035: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 51243 1727204722.44123: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.034) 0:00:05.049 ***** 51243 1727204722.44162: entering _queue_task() for managed-node3/yum 51243 1727204722.44612: worker is 1 (out of 1 available) 51243 1727204722.44626: exiting _queue_task() for managed-node3/yum 51243 1727204722.44641: done queuing things up, now waiting for results queue to drain 51243 1727204722.44643: waiting for pending results... 
51243 1727204722.44889: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 51243 1727204722.45036: in run() - task 127b8e07-fff9-5c5d-847b-000000000069 51243 1727204722.45070: variable 'ansible_search_path' from source: unknown 51243 1727204722.45079: variable 'ansible_search_path' from source: unknown 51243 1727204722.45121: calling self._execute() 51243 1727204722.45267: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.45271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.45273: variable 'omit' from source: magic vars 51243 1727204722.45710: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.45729: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.45888: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.45900: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.45907: when evaluation is False, skipping this task 51243 1727204722.45930: _execute() done 51243 1727204722.45935: dumping result to json 51243 1727204722.45970: done dumping result, returning 51243 1727204722.45974: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-000000000069] 51243 1727204722.45977: sending task result for task 127b8e07-fff9-5c5d-847b-000000000069 51243 1727204722.46237: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000069 51243 1727204722.46240: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 51243 1727204722.46307: no more pending results, returning what we have 51243 1727204722.46311: results queue empty 51243 1727204722.46312: checking for any_errors_fatal 51243 1727204722.46318: done checking for any_errors_fatal 51243 1727204722.46319: checking for max_fail_percentage 51243 1727204722.46321: done checking for max_fail_percentage 51243 1727204722.46322: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.46323: done checking to see if all hosts have failed 51243 1727204722.46324: getting the remaining hosts for this loop 51243 1727204722.46326: done getting the remaining hosts for this loop 51243 1727204722.46330: getting the next task for host managed-node3 51243 1727204722.46341: done getting next task for host managed-node3 51243 1727204722.46346: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 51243 1727204722.46357: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.46380: getting variables 51243 1727204722.46382: in VariableManager get_vars() 51243 1727204722.46442: Calling all_inventory to load vars for managed-node3 51243 1727204722.46446: Calling groups_inventory to load vars for managed-node3 51243 1727204722.46449: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.46577: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.46581: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.46585: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.46878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.47159: done with get_vars() 51243 1727204722.47173: done getting variables 51243 1727204722.47247: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.031) 0:00:05.080 ***** 51243 1727204722.47282: entering _queue_task() for managed-node3/fail 51243 1727204722.47886: worker is 1 (out of 1 available) 51243 1727204722.47895: exiting _queue_task() for managed-node3/fail 51243 1727204722.47906: done queuing things up, now waiting for results queue to drain 51243 1727204722.47907: waiting for pending results... 
51243 1727204722.48155: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 51243 1727204722.48160: in run() - task 127b8e07-fff9-5c5d-847b-00000000006a 51243 1727204722.48171: variable 'ansible_search_path' from source: unknown 51243 1727204722.48179: variable 'ansible_search_path' from source: unknown 51243 1727204722.48224: calling self._execute() 51243 1727204722.48351: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.48372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.48387: variable 'omit' from source: magic vars 51243 1727204722.48828: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.48849: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.48984: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.48995: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.49002: when evaluation is False, skipping this task 51243 1727204722.49021: _execute() done 51243 1727204722.49029: dumping result to json 51243 1727204722.49071: done dumping result, returning 51243 1727204722.49074: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-00000000006a] 51243 1727204722.49078: sending task result for task 127b8e07-fff9-5c5d-847b-00000000006a skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204722.49400: no more pending results, returning what we have 51243 1727204722.49403: results queue empty 51243 1727204722.49405: checking for any_errors_fatal 51243 1727204722.49413: done checking for 
any_errors_fatal 51243 1727204722.49414: checking for max_fail_percentage 51243 1727204722.49415: done checking for max_fail_percentage 51243 1727204722.49416: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.49417: done checking to see if all hosts have failed 51243 1727204722.49418: getting the remaining hosts for this loop 51243 1727204722.49420: done getting the remaining hosts for this loop 51243 1727204722.49424: getting the next task for host managed-node3 51243 1727204722.49436: done getting next task for host managed-node3 51243 1727204722.49440: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 51243 1727204722.49445: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.49471: getting variables 51243 1727204722.49473: in VariableManager get_vars() 51243 1727204722.49528: Calling all_inventory to load vars for managed-node3 51243 1727204722.49531: Calling groups_inventory to load vars for managed-node3 51243 1727204722.49536: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.49550: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.49676: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.49682: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.49987: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000006a 51243 1727204722.49991: WORKER PROCESS EXITING 51243 1727204722.50028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.50305: done with get_vars() 51243 1727204722.50318: done getting variables 51243 1727204722.50395: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.031) 0:00:05.112 ***** 51243 1727204722.50443: entering _queue_task() for managed-node3/package 51243 1727204722.50891: worker is 1 (out of 1 available) 51243 1727204722.50903: exiting _queue_task() for managed-node3/package 51243 1727204722.50915: done queuing things up, now waiting for results queue to drain 51243 1727204722.50916: waiting for pending results... 
51243 1727204722.51123: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 51243 1727204722.51328: in run() - task 127b8e07-fff9-5c5d-847b-00000000006b 51243 1727204722.51351: variable 'ansible_search_path' from source: unknown 51243 1727204722.51365: variable 'ansible_search_path' from source: unknown 51243 1727204722.51419: calling self._execute() 51243 1727204722.51525: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.51544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.51559: variable 'omit' from source: magic vars 51243 1727204722.52011: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.52030: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.52200: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.52203: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.52206: when evaluation is False, skipping this task 51243 1727204722.52214: _execute() done 51243 1727204722.52226: dumping result to json 51243 1727204722.52236: done dumping result, returning 51243 1727204722.52247: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-5c5d-847b-00000000006b] 51243 1727204722.52257: sending task result for task 127b8e07-fff9-5c5d-847b-00000000006b 51243 1727204722.52409: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000006b 51243 1727204722.52412: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204722.52522: no more pending results, returning what we have 51243 1727204722.52526: results queue empty 51243 1727204722.52527: checking for any_errors_fatal 51243 1727204722.52538: done 
checking for any_errors_fatal 51243 1727204722.52539: checking for max_fail_percentage 51243 1727204722.52541: done checking for max_fail_percentage 51243 1727204722.52542: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.52543: done checking to see if all hosts have failed 51243 1727204722.52544: getting the remaining hosts for this loop 51243 1727204722.52547: done getting the remaining hosts for this loop 51243 1727204722.52552: getting the next task for host managed-node3 51243 1727204722.52560: done getting next task for host managed-node3 51243 1727204722.52566: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 51243 1727204722.52570: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
51243 1727204722.52592: getting variables
51243 1727204722.52594: in VariableManager get_vars()
51243 1727204722.52647: Calling all_inventory to load vars for managed-node3
51243 1727204722.52650: Calling groups_inventory to load vars for managed-node3
51243 1727204722.52652: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204722.52870: Calling all_plugins_play to load vars for managed-node3
51243 1727204722.52875: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204722.52886: Calling groups_plugins_play to load vars for managed-node3
51243 1727204722.53235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204722.53498: done with get_vars()
51243 1727204722.53510: done getting variables
51243 1727204722.53591: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.031) 0:00:05.144 *****
51243 1727204722.53626: entering _queue_task() for managed-node3/package
51243 1727204722.54061: worker is 1 (out of 1 available)
51243 1727204722.54177: exiting _queue_task() for managed-node3/package
51243 1727204722.54195: done queuing things up, now waiting for results queue to drain
51243 1727204722.54197: waiting for pending results...
51243 1727204722.54536: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
51243 1727204722.54629: in run() - task 127b8e07-fff9-5c5d-847b-00000000006c
51243 1727204722.54636: variable 'ansible_search_path' from source: unknown
51243 1727204722.54640: variable 'ansible_search_path' from source: unknown
51243 1727204722.54667: calling self._execute()
51243 1727204722.54777: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204722.54791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204722.54816: variable 'omit' from source: magic vars
51243 1727204722.55352: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.55356: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204722.55484: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.55502: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204722.55511: when evaluation is False, skipping this task
51243 1727204722.55519: _execute() done
51243 1727204722.55527: dumping result to json
51243 1727204722.55569: done dumping result, returning
51243 1727204722.55574: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-5c5d-847b-00000000006c]
51243 1727204722.55578: sending task result for task 127b8e07-fff9-5c5d-847b-00000000006c
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204722.55922: no more pending results, returning what we have
51243 1727204722.55927: results queue empty
51243 1727204722.55928: checking for any_errors_fatal
51243 1727204722.55939: done checking for any_errors_fatal
51243 1727204722.55940: checking for max_fail_percentage
51243 1727204722.55942: done checking for max_fail_percentage
51243 1727204722.55943: checking to see if all hosts have failed and the running result is not ok
51243 1727204722.55944: done checking to see if all hosts have failed
51243 1727204722.55945: getting the remaining hosts for this loop
51243 1727204722.55947: done getting the remaining hosts for this loop
51243 1727204722.55951: getting the next task for host managed-node3
51243 1727204722.55959: done getting next task for host managed-node3
51243 1727204722.55964: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
51243 1727204722.55970: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204722.55991: getting variables
51243 1727204722.55993: in VariableManager get_vars()
51243 1727204722.56056: Calling all_inventory to load vars for managed-node3
51243 1727204722.56060: Calling groups_inventory to load vars for managed-node3
51243 1727204722.56063: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204722.56073: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000006c
51243 1727204722.56076: WORKER PROCESS EXITING
51243 1727204722.56205: Calling all_plugins_play to load vars for managed-node3
51243 1727204722.56209: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204722.56213: Calling groups_plugins_play to load vars for managed-node3
51243 1727204722.56566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204722.57245: done with get_vars()
51243 1727204722.57264: done getting variables
51243 1727204722.57446: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.038) 0:00:05.182 *****
51243 1727204722.57489: entering _queue_task() for managed-node3/package
51243 1727204722.58107: worker is 1 (out of 1 available)
51243 1727204722.58120: exiting _queue_task() for managed-node3/package
51243 1727204722.58137: done queuing things up, now waiting for results queue to drain
51243 1727204722.58139: waiting for pending results...
51243 1727204722.58645: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
51243 1727204722.58824: in run() - task 127b8e07-fff9-5c5d-847b-00000000006d
51243 1727204722.58828: variable 'ansible_search_path' from source: unknown
51243 1727204722.58831: variable 'ansible_search_path' from source: unknown
51243 1727204722.58873: calling self._execute()
51243 1727204722.58939: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204722.58943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204722.58951: variable 'omit' from source: magic vars
51243 1727204722.59941: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.59971: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204722.60085: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.60089: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204722.60093: when evaluation is False, skipping this task
51243 1727204722.60095: _execute() done
51243 1727204722.60126: dumping result to json
51243 1727204722.60129: done dumping result, returning
51243 1727204722.60140: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-5c5d-847b-00000000006d]
51243 1727204722.60144: sending task result for task 127b8e07-fff9-5c5d-847b-00000000006d
51243 1727204722.60236: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000006d
51243 1727204722.60241: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204722.60296: no more pending results, returning what we have
51243 1727204722.60299: results queue empty
51243 1727204722.60301: checking for any_errors_fatal
51243 1727204722.60308: done checking for any_errors_fatal
51243 1727204722.60308: checking for max_fail_percentage
51243 1727204722.60310: done checking for max_fail_percentage
51243 1727204722.60311: checking to see if all hosts have failed and the running result is not ok
51243 1727204722.60312: done checking to see if all hosts have failed
51243 1727204722.60312: getting the remaining hosts for this loop
51243 1727204722.60314: done getting the remaining hosts for this loop
51243 1727204722.60319: getting the next task for host managed-node3
51243 1727204722.60326: done getting next task for host managed-node3
51243 1727204722.60331: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
51243 1727204722.60336: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204722.60356: getting variables
51243 1727204722.60358: in VariableManager get_vars()
51243 1727204722.60425: Calling all_inventory to load vars for managed-node3
51243 1727204722.60429: Calling groups_inventory to load vars for managed-node3
51243 1727204722.60431: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204722.60680: Calling all_plugins_play to load vars for managed-node3
51243 1727204722.60684: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204722.60687: Calling groups_plugins_play to load vars for managed-node3
51243 1727204722.61181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204722.61882: done with get_vars()
51243 1727204722.61898: done getting variables
51243 1727204722.62086: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.047) 0:00:05.230 *****
51243 1727204722.62286: entering _queue_task() for managed-node3/service
51243 1727204722.62993: worker is 1 (out of 1 available)
51243 1727204722.63122: exiting _queue_task() for managed-node3/service
51243 1727204722.63140: done queuing things up, now waiting for results queue to drain
51243 1727204722.63141: waiting for pending results...
51243 1727204722.64189: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
51243 1727204722.64195: in run() - task 127b8e07-fff9-5c5d-847b-00000000006e
51243 1727204722.64198: variable 'ansible_search_path' from source: unknown
51243 1727204722.64201: variable 'ansible_search_path' from source: unknown
51243 1727204722.64204: calling self._execute()
51243 1727204722.64295: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204722.64300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204722.64311: variable 'omit' from source: magic vars
51243 1727204722.65513: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.65518: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204722.65785: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.65790: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204722.65793: when evaluation is False, skipping this task
51243 1727204722.65796: _execute() done
51243 1727204722.65801: dumping result to json
51243 1727204722.65804: done dumping result, returning
51243 1727204722.65816: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-00000000006e]
51243 1727204722.65822: sending task result for task 127b8e07-fff9-5c5d-847b-00000000006e
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204722.66308: no more pending results, returning what we have
51243 1727204722.66312: results queue empty
51243 1727204722.66314: checking for any_errors_fatal
51243 1727204722.66323: done checking for any_errors_fatal
51243 1727204722.66324: checking for max_fail_percentage
51243 1727204722.66326: done checking for max_fail_percentage
51243 1727204722.66327: checking to see if all hosts have failed and the running result is not ok
51243 1727204722.66328: done checking to see if all hosts have failed
51243 1727204722.66329: getting the remaining hosts for this loop
51243 1727204722.66331: done getting the remaining hosts for this loop
51243 1727204722.66339: getting the next task for host managed-node3
51243 1727204722.66349: done getting next task for host managed-node3
51243 1727204722.66354: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
51243 1727204722.66359: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204722.66380: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000006e
51243 1727204722.66384: WORKER PROCESS EXITING
51243 1727204722.66599: getting variables
51243 1727204722.66601: in VariableManager get_vars()
51243 1727204722.66659: Calling all_inventory to load vars for managed-node3
51243 1727204722.66663: Calling groups_inventory to load vars for managed-node3
51243 1727204722.66668: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204722.66682: Calling all_plugins_play to load vars for managed-node3
51243 1727204722.66685: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204722.66689: Calling groups_plugins_play to load vars for managed-node3
51243 1727204722.67317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204722.68082: done with get_vars()
51243 1727204722.68104: done getting variables
51243 1727204722.68239: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.060) 0:00:05.290 *****
51243 1727204722.68277: entering _queue_task() for managed-node3/service
51243 1727204722.68759: worker is 1 (out of 1 available)
51243 1727204722.68775: exiting _queue_task() for managed-node3/service
51243 1727204722.68789: done queuing things up, now waiting for results queue to drain
51243 1727204722.68791: waiting for pending results...
51243 1727204722.69196: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
51243 1727204722.69273: in run() - task 127b8e07-fff9-5c5d-847b-00000000006f
51243 1727204722.69277: variable 'ansible_search_path' from source: unknown
51243 1727204722.69278: variable 'ansible_search_path' from source: unknown
51243 1727204722.69282: calling self._execute()
51243 1727204722.69390: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204722.69396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204722.69415: variable 'omit' from source: magic vars
51243 1727204722.69968: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.70107: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204722.70113: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.70126: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204722.70130: when evaluation is False, skipping this task
51243 1727204722.70135: _execute() done
51243 1727204722.70138: dumping result to json
51243 1727204722.70140: done dumping result, returning
51243 1727204722.70153: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-5c5d-847b-00000000006f]
51243 1727204722.70173: sending task result for task 127b8e07-fff9-5c5d-847b-00000000006f
skipping: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
51243 1727204722.70341: no more pending results, returning what we have
51243 1727204722.70345: results queue empty
51243 1727204722.70347: checking for any_errors_fatal
51243 1727204722.70360: done checking for any_errors_fatal
51243 1727204722.70362: checking for max_fail_percentage
51243 1727204722.70364: done checking for max_fail_percentage
51243 1727204722.70368: checking to see if all hosts have failed and the running result is not ok
51243 1727204722.70369: done checking to see if all hosts have failed
51243 1727204722.70370: getting the remaining hosts for this loop
51243 1727204722.70372: done getting the remaining hosts for this loop
51243 1727204722.70377: getting the next task for host managed-node3
51243 1727204722.70386: done getting next task for host managed-node3
51243 1727204722.70390: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
51243 1727204722.70394: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204722.70416: getting variables
51243 1727204722.70418: in VariableManager get_vars()
51243 1727204722.70590: Calling all_inventory to load vars for managed-node3
51243 1727204722.70594: Calling groups_inventory to load vars for managed-node3
51243 1727204722.70597: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204722.70613: Calling all_plugins_play to load vars for managed-node3
51243 1727204722.70616: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204722.70620: Calling groups_plugins_play to load vars for managed-node3
51243 1727204722.71473: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000006f
51243 1727204722.71525: WORKER PROCESS EXITING
51243 1727204722.71594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204722.72072: done with get_vars()
51243 1727204722.72089: done getting variables
51243 1727204722.72159: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.039) 0:00:05.329 *****
51243 1727204722.72199: entering _queue_task() for managed-node3/service
51243 1727204722.72669: worker is 1 (out of 1 available)
51243 1727204722.72682: exiting _queue_task() for managed-node3/service
51243 1727204722.72695: done queuing things up, now waiting for results queue to drain
51243 1727204722.72696: waiting for pending results...
51243 1727204722.72906: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
51243 1727204722.73042: in run() - task 127b8e07-fff9-5c5d-847b-000000000070
51243 1727204722.73057: variable 'ansible_search_path' from source: unknown
51243 1727204722.73060: variable 'ansible_search_path' from source: unknown
51243 1727204722.73107: calling self._execute()
51243 1727204722.73204: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204722.73210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204722.73223: variable 'omit' from source: magic vars
51243 1727204722.73657: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.73670: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204722.73804: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.73810: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204722.73813: when evaluation is False, skipping this task
51243 1727204722.73816: _execute() done
51243 1727204722.73819: dumping result to json
51243 1727204722.73823: done dumping result, returning
51243 1727204722.73836: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-5c5d-847b-000000000070]
51243 1727204722.73839: sending task result for task 127b8e07-fff9-5c5d-847b-000000000070
51243 1727204722.73952: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000070
51243 1727204722.73954: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204722.74009: no more pending results, returning what we have
51243 1727204722.74013: results queue empty
51243 1727204722.74014: checking for any_errors_fatal
51243 1727204722.74023: done checking for any_errors_fatal
51243 1727204722.74024: checking for max_fail_percentage
51243 1727204722.74026: done checking for max_fail_percentage
51243 1727204722.74027: checking to see if all hosts have failed and the running result is not ok
51243 1727204722.74028: done checking to see if all hosts have failed
51243 1727204722.74029: getting the remaining hosts for this loop
51243 1727204722.74031: done getting the remaining hosts for this loop
51243 1727204722.74035: getting the next task for host managed-node3
51243 1727204722.74042: done getting next task for host managed-node3
51243 1727204722.74046: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
51243 1727204722.74050: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204722.74071: getting variables
51243 1727204722.74073: in VariableManager get_vars()
51243 1727204722.74126: Calling all_inventory to load vars for managed-node3
51243 1727204722.74129: Calling groups_inventory to load vars for managed-node3
51243 1727204722.74132: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204722.74147: Calling all_plugins_play to load vars for managed-node3
51243 1727204722.74150: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204722.74153: Calling groups_plugins_play to load vars for managed-node3
51243 1727204722.74584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204722.74884: done with get_vars()
51243 1727204722.74896: done getting variables
51243 1727204722.74963: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.027) 0:00:05.357 *****
51243 1727204722.74998: entering _queue_task() for managed-node3/service
51243 1727204722.75326: worker is 1 (out of 1 available)
51243 1727204722.75341: exiting _queue_task() for managed-node3/service
51243 1727204722.75468: done queuing things up, now waiting for results queue to drain
51243 1727204722.75471: waiting for pending results...
51243 1727204722.75824: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service
51243 1727204722.76074: in run() - task 127b8e07-fff9-5c5d-847b-000000000071
51243 1727204722.76078: variable 'ansible_search_path' from source: unknown
51243 1727204722.76081: variable 'ansible_search_path' from source: unknown
51243 1727204722.76085: calling self._execute()
51243 1727204722.76088: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204722.76091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204722.76093: variable 'omit' from source: magic vars
51243 1727204722.76445: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.76456: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204722.76577: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.76595: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204722.76599: when evaluation is False, skipping this task
51243 1727204722.76602: _execute() done
51243 1727204722.76604: dumping result to json
51243 1727204722.76607: done dumping result, returning
51243 1727204722.76616: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-5c5d-847b-000000000071]
51243 1727204722.76622: sending task result for task 127b8e07-fff9-5c5d-847b-000000000071
skipping: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
51243 1727204722.76773: no more pending results, returning what we have
51243 1727204722.76776: results queue empty
51243 1727204722.76778: checking for any_errors_fatal
51243 1727204722.76788: done checking for any_errors_fatal
51243 1727204722.76789: checking for max_fail_percentage
51243 1727204722.76791: done checking for max_fail_percentage
51243 1727204722.76792: checking to see if all hosts have failed and the running result is not ok
51243 1727204722.76793: done checking to see if all hosts have failed
51243 1727204722.76794: getting the remaining hosts for this loop
51243 1727204722.76796: done getting the remaining hosts for this loop
51243 1727204722.76802: getting the next task for host managed-node3
51243 1727204722.76810: done getting next task for host managed-node3
51243 1727204722.76813: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
51243 1727204722.76819: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204722.76838: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000071
51243 1727204722.76842: WORKER PROCESS EXITING
51243 1727204722.76878: getting variables
51243 1727204722.76881: in VariableManager get_vars()
51243 1727204722.77172: Calling all_inventory to load vars for managed-node3
51243 1727204722.77176: Calling groups_inventory to load vars for managed-node3
51243 1727204722.77179: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204722.77191: Calling all_plugins_play to load vars for managed-node3
51243 1727204722.77194: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204722.77197: Calling groups_plugins_play to load vars for managed-node3
51243 1727204722.77419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204722.77705: done with get_vars()
51243 1727204722.77724: done getting variables
51243 1727204722.77793: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.028) 0:00:05.386 *****
51243 1727204722.77834: entering _queue_task() for managed-node3/copy
51243 1727204722.78395: worker is 1 (out of 1 available)
51243 1727204722.78406: exiting _queue_task() for managed-node3/copy
51243 1727204722.78417: done queuing things up, now waiting for results queue to drain
51243 1727204722.78419: waiting for pending results...
51243 1727204722.78691: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
51243 1727204722.78697: in run() - task 127b8e07-fff9-5c5d-847b-000000000072
51243 1727204722.78700: variable 'ansible_search_path' from source: unknown
51243 1727204722.78704: variable 'ansible_search_path' from source: unknown
51243 1727204722.78730: calling self._execute()
51243 1727204722.79017: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204722.79021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204722.79023: variable 'omit' from source: magic vars
51243 1727204722.80048: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.80053: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204722.80272: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.80276: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204722.80279: when evaluation is False, skipping this task
51243 1727204722.80282: _execute() done
51243 1727204722.80285: dumping result to json
51243 1727204722.80288: done dumping result, returning
51243 1727204722.80294: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-5c5d-847b-000000000072]
51243 1727204722.80472: sending task result for task 127b8e07-fff9-5c5d-847b-000000000072
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204722.80606: no more pending results, returning what we have
51243 1727204722.80610: results queue empty
51243 1727204722.80611: checking for any_errors_fatal
51243 1727204722.80619: done checking for any_errors_fatal
51243 1727204722.80619: checking for max_fail_percentage
51243 1727204722.80621: done checking for max_fail_percentage
51243 1727204722.80622: checking to see if all hosts have failed and the running result is not ok
51243 1727204722.80623: done checking to see if all hosts have failed
51243 1727204722.80624: getting the remaining hosts for this loop
51243 1727204722.80626: done getting the remaining hosts for this loop
51243 1727204722.80631: getting the next task for host managed-node3
51243 1727204722.80639: done getting next task for host managed-node3
51243 1727204722.80643: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
51243 1727204722.80647: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204722.80677: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000072
51243 1727204722.80681: WORKER PROCESS EXITING
51243 1727204722.80692: getting variables
51243 1727204722.80694: in VariableManager get_vars()
51243 1727204722.80753: Calling all_inventory to load vars for managed-node3
51243 1727204722.80757: Calling groups_inventory to load vars for managed-node3
51243 1727204722.80759: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204722.80776: Calling all_plugins_play to load vars for managed-node3
51243 1727204722.80780: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204722.80783: Calling groups_plugins_play to load vars for managed-node3
51243 1727204722.81691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204722.82181: done with get_vars()
51243 1727204722.82284: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.046) 0:00:05.433 *****
51243 1727204722.82536: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections
51243 1727204722.83076: worker is 1 (out of 1 available)
51243 1727204722.83090: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections
51243 1727204722.83102: done queuing things up, now waiting for results queue to drain
51243 1727204722.83104: waiting for pending results...
51243 1727204722.83387: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
51243 1727204722.83543: in run() - task 127b8e07-fff9-5c5d-847b-000000000073
51243 1727204722.83561: variable 'ansible_search_path' from source: unknown
51243 1727204722.83566: variable 'ansible_search_path' from source: unknown
51243 1727204722.83625: calling self._execute()
51243 1727204722.83712: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204722.83727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204722.83738: variable 'omit' from source: magic vars
51243 1727204722.84371: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.84375: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204722.84771: variable 'ansible_distribution_major_version' from source: facts
51243 1727204722.84815: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204722.84819: when evaluation is False, skipping this task
51243 1727204722.84822: _execute() done
51243 1727204722.84826: dumping result to json
51243 1727204722.84829: done dumping result, returning
51243 1727204722.84838: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-5c5d-847b-000000000073]
51243 1727204722.84841: sending task result for task 127b8e07-fff9-5c5d-847b-000000000073
51243 1727204722.85172: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000073
51243 1727204722.85177: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204722.85234: no more pending results, returning what we have
51243 1727204722.85237: results queue empty
51243 1727204722.85239: checking
for any_errors_fatal 51243 1727204722.85248: done checking for any_errors_fatal 51243 1727204722.85249: checking for max_fail_percentage 51243 1727204722.85250: done checking for max_fail_percentage 51243 1727204722.85251: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.85252: done checking to see if all hosts have failed 51243 1727204722.85253: getting the remaining hosts for this loop 51243 1727204722.85255: done getting the remaining hosts for this loop 51243 1727204722.85259: getting the next task for host managed-node3 51243 1727204722.85268: done getting next task for host managed-node3 51243 1727204722.85273: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 51243 1727204722.85277: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.85295: getting variables 51243 1727204722.85297: in VariableManager get_vars() 51243 1727204722.85350: Calling all_inventory to load vars for managed-node3 51243 1727204722.85353: Calling groups_inventory to load vars for managed-node3 51243 1727204722.85356: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.85771: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.85776: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.85781: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.86191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.86626: done with get_vars() 51243 1727204722.86641: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.041) 0:00:05.475 ***** 51243 1727204722.86726: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 51243 1727204722.87495: worker is 1 (out of 1 available) 51243 1727204722.87512: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 51243 1727204722.87525: done queuing things up, now waiting for results queue to drain 51243 1727204722.87527: waiting for pending results... 
51243 1727204722.88091: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 51243 1727204722.88328: in run() - task 127b8e07-fff9-5c5d-847b-000000000074 51243 1727204722.88427: variable 'ansible_search_path' from source: unknown 51243 1727204722.88431: variable 'ansible_search_path' from source: unknown 51243 1727204722.88434: calling self._execute() 51243 1727204722.88594: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.88598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.88610: variable 'omit' from source: magic vars 51243 1727204722.89569: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.89581: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.90102: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.90106: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.90109: when evaluation is False, skipping this task 51243 1727204722.90111: _execute() done 51243 1727204722.90114: dumping result to json 51243 1727204722.90117: done dumping result, returning 51243 1727204722.90128: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-5c5d-847b-000000000074] 51243 1727204722.90160: sending task result for task 127b8e07-fff9-5c5d-847b-000000000074 51243 1727204722.90244: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000074 51243 1727204722.90247: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204722.90322: no more pending results, returning what we have 51243 1727204722.90325: results queue empty 51243 1727204722.90327: checking for any_errors_fatal 51243 
1727204722.90335: done checking for any_errors_fatal 51243 1727204722.90336: checking for max_fail_percentage 51243 1727204722.90338: done checking for max_fail_percentage 51243 1727204722.90338: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.90339: done checking to see if all hosts have failed 51243 1727204722.90340: getting the remaining hosts for this loop 51243 1727204722.90342: done getting the remaining hosts for this loop 51243 1727204722.90346: getting the next task for host managed-node3 51243 1727204722.90353: done getting next task for host managed-node3 51243 1727204722.90357: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 51243 1727204722.90360: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.90383: getting variables 51243 1727204722.90385: in VariableManager get_vars() 51243 1727204722.90441: Calling all_inventory to load vars for managed-node3 51243 1727204722.90445: Calling groups_inventory to load vars for managed-node3 51243 1727204722.90447: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.90463: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.90670: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.90676: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.91123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.91475: done with get_vars() 51243 1727204722.91489: done getting variables 51243 1727204722.91554: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.050) 0:00:05.525 ***** 51243 1727204722.91799: entering _queue_task() for managed-node3/debug 51243 1727204722.92445: worker is 1 (out of 1 available) 51243 1727204722.92462: exiting _queue_task() for managed-node3/debug 51243 1727204722.92687: done queuing things up, now waiting for results queue to drain 51243 1727204722.92689: waiting for pending results... 
51243 1727204722.92991: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 51243 1727204722.93005: in run() - task 127b8e07-fff9-5c5d-847b-000000000075 51243 1727204722.93046: variable 'ansible_search_path' from source: unknown 51243 1727204722.93055: variable 'ansible_search_path' from source: unknown 51243 1727204722.93117: calling self._execute() 51243 1727204722.93214: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.93240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.93335: variable 'omit' from source: magic vars 51243 1727204722.93746: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.93767: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.93919: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.93931: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.93938: when evaluation is False, skipping this task 51243 1727204722.93947: _execute() done 51243 1727204722.93954: dumping result to json 51243 1727204722.93963: done dumping result, returning 51243 1727204722.93979: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-5c5d-847b-000000000075] 51243 1727204722.93995: sending task result for task 127b8e07-fff9-5c5d-847b-000000000075 51243 1727204722.94261: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000075 51243 1727204722.94264: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204722.94322: no more pending results, returning what we have 51243 1727204722.94326: results queue empty 51243 1727204722.94327: checking for any_errors_fatal 51243 1727204722.94334: done 
checking for any_errors_fatal 51243 1727204722.94335: checking for max_fail_percentage 51243 1727204722.94337: done checking for max_fail_percentage 51243 1727204722.94337: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.94338: done checking to see if all hosts have failed 51243 1727204722.94340: getting the remaining hosts for this loop 51243 1727204722.94342: done getting the remaining hosts for this loop 51243 1727204722.94347: getting the next task for host managed-node3 51243 1727204722.94355: done getting next task for host managed-node3 51243 1727204722.94359: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 51243 1727204722.94368: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.94392: getting variables 51243 1727204722.94394: in VariableManager get_vars() 51243 1727204722.94454: Calling all_inventory to load vars for managed-node3 51243 1727204722.94457: Calling groups_inventory to load vars for managed-node3 51243 1727204722.94460: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.94588: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.94592: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.94597: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.94918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.95190: done with get_vars() 51243 1727204722.95204: done getting variables 51243 1727204722.95288: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.035) 0:00:05.561 ***** 51243 1727204722.95325: entering _queue_task() for managed-node3/debug 51243 1727204722.95807: worker is 1 (out of 1 available) 51243 1727204722.95822: exiting _queue_task() for managed-node3/debug 51243 1727204722.95835: done queuing things up, now waiting for results queue to drain 51243 1727204722.95837: waiting for pending results... 
51243 1727204722.96119: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 51243 1727204722.96293: in run() - task 127b8e07-fff9-5c5d-847b-000000000076 51243 1727204722.96374: variable 'ansible_search_path' from source: unknown 51243 1727204722.96378: variable 'ansible_search_path' from source: unknown 51243 1727204722.96389: calling self._execute() 51243 1727204722.96496: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204722.96510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204722.96524: variable 'omit' from source: magic vars 51243 1727204722.97435: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.97519: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204722.97604: variable 'ansible_distribution_major_version' from source: facts 51243 1727204722.97618: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204722.97634: when evaluation is False, skipping this task 51243 1727204722.97643: _execute() done 51243 1727204722.97650: dumping result to json 51243 1727204722.97747: done dumping result, returning 51243 1727204722.97751: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-5c5d-847b-000000000076] 51243 1727204722.97754: sending task result for task 127b8e07-fff9-5c5d-847b-000000000076 51243 1727204722.97833: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000076 51243 1727204722.97837: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204722.97895: no more pending results, returning what we have 51243 1727204722.97899: results queue empty 51243 1727204722.97900: checking for any_errors_fatal 51243 1727204722.97908: done 
checking for any_errors_fatal 51243 1727204722.97909: checking for max_fail_percentage 51243 1727204722.97911: done checking for max_fail_percentage 51243 1727204722.97912: checking to see if all hosts have failed and the running result is not ok 51243 1727204722.97913: done checking to see if all hosts have failed 51243 1727204722.97914: getting the remaining hosts for this loop 51243 1727204722.97917: done getting the remaining hosts for this loop 51243 1727204722.97922: getting the next task for host managed-node3 51243 1727204722.97930: done getting next task for host managed-node3 51243 1727204722.97934: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 51243 1727204722.97938: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204722.97958: getting variables 51243 1727204722.97961: in VariableManager get_vars() 51243 1727204722.98020: Calling all_inventory to load vars for managed-node3 51243 1727204722.98023: Calling groups_inventory to load vars for managed-node3 51243 1727204722.98026: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204722.98041: Calling all_plugins_play to load vars for managed-node3 51243 1727204722.98045: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204722.98048: Calling groups_plugins_play to load vars for managed-node3 51243 1727204722.98881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204722.99152: done with get_vars() 51243 1727204722.99167: done getting variables 51243 1727204722.99238: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:22 -0400 (0:00:00.039) 0:00:05.600 ***** 51243 1727204722.99285: entering _queue_task() for managed-node3/debug 51243 1727204722.99733: worker is 1 (out of 1 available) 51243 1727204722.99745: exiting _queue_task() for managed-node3/debug 51243 1727204722.99757: done queuing things up, now waiting for results queue to drain 51243 1727204722.99758: waiting for pending results... 
51243 1727204723.00085: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 51243 1727204723.00126: in run() - task 127b8e07-fff9-5c5d-847b-000000000077 51243 1727204723.00158: variable 'ansible_search_path' from source: unknown 51243 1727204723.00173: variable 'ansible_search_path' from source: unknown 51243 1727204723.00246: calling self._execute() 51243 1727204723.00318: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204723.00333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204723.00355: variable 'omit' from source: magic vars 51243 1727204723.00873: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.00877: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204723.00993: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.01012: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204723.01020: when evaluation is False, skipping this task 51243 1727204723.01027: _execute() done 51243 1727204723.01035: dumping result to json 51243 1727204723.01121: done dumping result, returning 51243 1727204723.01125: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-5c5d-847b-000000000077] 51243 1727204723.01128: sending task result for task 127b8e07-fff9-5c5d-847b-000000000077 skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204723.01282: no more pending results, returning what we have 51243 1727204723.01286: results queue empty 51243 1727204723.01287: checking for any_errors_fatal 51243 1727204723.01297: done checking for any_errors_fatal 51243 1727204723.01298: checking for max_fail_percentage 51243 1727204723.01300: done checking for max_fail_percentage 51243 
1727204723.01301: checking to see if all hosts have failed and the running result is not ok 51243 1727204723.01302: done checking to see if all hosts have failed 51243 1727204723.01303: getting the remaining hosts for this loop 51243 1727204723.01305: done getting the remaining hosts for this loop 51243 1727204723.01310: getting the next task for host managed-node3 51243 1727204723.01318: done getting next task for host managed-node3 51243 1727204723.01323: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 51243 1727204723.01327: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204723.01348: getting variables 51243 1727204723.01350: in VariableManager get_vars() 51243 1727204723.01409: Calling all_inventory to load vars for managed-node3 51243 1727204723.01634: Calling groups_inventory to load vars for managed-node3 51243 1727204723.01642: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204723.01659: Calling all_plugins_play to load vars for managed-node3 51243 1727204723.01662: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204723.01668: Calling groups_plugins_play to load vars for managed-node3 51243 1727204723.02175: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000077 51243 1727204723.02179: WORKER PROCESS EXITING 51243 1727204723.02216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204723.02930: done with get_vars() 51243 1727204723.02945: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.038) 0:00:05.639 ***** 51243 1727204723.03234: entering _queue_task() for managed-node3/ping 51243 1727204723.04108: worker is 1 (out of 1 available) 51243 1727204723.04121: exiting _queue_task() for managed-node3/ping 51243 1727204723.04133: done queuing things up, now waiting for results queue to drain 51243 1727204723.04135: waiting for pending results... 
51243 1727204723.04325: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 51243 1727204723.04488: in run() - task 127b8e07-fff9-5c5d-847b-000000000078 51243 1727204723.04776: variable 'ansible_search_path' from source: unknown 51243 1727204723.04780: variable 'ansible_search_path' from source: unknown 51243 1727204723.04783: calling self._execute() 51243 1727204723.04906: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204723.04918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204723.04934: variable 'omit' from source: magic vars 51243 1727204723.05581: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.05783: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204723.05853: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.05875: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204723.05883: when evaluation is False, skipping this task 51243 1727204723.05891: _execute() done 51243 1727204723.05898: dumping result to json 51243 1727204723.05905: done dumping result, returning 51243 1727204723.05917: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-5c5d-847b-000000000078] 51243 1727204723.05927: sending task result for task 127b8e07-fff9-5c5d-847b-000000000078 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204723.06425: no more pending results, returning what we have 51243 1727204723.06429: results queue empty 51243 1727204723.06430: checking for any_errors_fatal 51243 1727204723.06438: done checking for any_errors_fatal 51243 1727204723.06439: checking for max_fail_percentage 51243 1727204723.06441: done checking for 
max_fail_percentage 51243 1727204723.06442: checking to see if all hosts have failed and the running result is not ok 51243 1727204723.06443: done checking to see if all hosts have failed 51243 1727204723.06444: getting the remaining hosts for this loop 51243 1727204723.06446: done getting the remaining hosts for this loop 51243 1727204723.06451: getting the next task for host managed-node3 51243 1727204723.06570: done getting next task for host managed-node3 51243 1727204723.06573: ^ task is: TASK: meta (role_complete) 51243 1727204723.06577: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204723.06600: getting variables 51243 1727204723.06602: in VariableManager get_vars() 51243 1727204723.06662: Calling all_inventory to load vars for managed-node3 51243 1727204723.06871: Calling groups_inventory to load vars for managed-node3 51243 1727204723.06876: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204723.06891: Calling all_plugins_play to load vars for managed-node3 51243 1727204723.06894: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204723.06897: Calling groups_plugins_play to load vars for managed-node3 51243 1727204723.07746: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000078 51243 1727204723.07750: WORKER PROCESS EXITING 51243 1727204723.07789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204723.08434: done with get_vars() 51243 1727204723.08451: done getting variables 51243 1727204723.08752: done queuing things up, now waiting for results queue to drain 51243 1727204723.08755: results queue empty 51243 1727204723.08756: checking for any_errors_fatal 51243 1727204723.08759: done checking for any_errors_fatal 51243 1727204723.08760: checking for max_fail_percentage 51243 1727204723.08761: done checking for max_fail_percentage 51243 1727204723.08762: checking to see if all hosts have failed and the running result is not ok 51243 1727204723.08762: done checking to see if all hosts have failed 51243 1727204723.08763: getting the remaining hosts for this loop 51243 1727204723.08764: done getting the remaining hosts for this loop 51243 1727204723.08771: getting the next task for host managed-node3 51243 1727204723.08776: done getting next task for host managed-node3 51243 1727204723.08778: ^ task is: TASK: TEST: wireless connection with 802.1x TLS-EAP 51243 1727204723.08780: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204723.08783: getting variables 51243 1727204723.08784: in VariableManager get_vars() 51243 1727204723.08809: Calling all_inventory to load vars for managed-node3 51243 1727204723.08811: Calling groups_inventory to load vars for managed-node3 51243 1727204723.08814: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204723.08820: Calling all_plugins_play to load vars for managed-node3 51243 1727204723.08823: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204723.08825: Calling groups_plugins_play to load vars for managed-node3 51243 1727204723.09554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204723.10009: done with get_vars() 51243 1727204723.10024: done getting variables 51243 1727204723.10304: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: wireless connection with 802.1x TLS-EAP] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:53 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.071) 0:00:05.711 ***** 51243 1727204723.10335: entering _queue_task() for managed-node3/debug 51243 1727204723.11307: worker is 1 (out of 1 available) 51243 1727204723.11320: exiting _queue_task() for managed-node3/debug 51243 1727204723.11330: done queuing things up, now waiting for results queue to drain 51243 1727204723.11332: waiting for pending results... 
51243 1727204723.11670: running TaskExecutor() for managed-node3/TASK: TEST: wireless connection with 802.1x TLS-EAP
51243 1727204723.12074: in run() - task 127b8e07-fff9-5c5d-847b-0000000000a8
51243 1727204723.12079: variable 'ansible_search_path' from source: unknown
51243 1727204723.12082: calling self._execute()
51243 1727204723.12259: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.12319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.12337: variable 'omit' from source: magic vars
51243 1727204723.13282: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.13306: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.13583: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.13594: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.13600: when evaluation is False, skipping this task
51243 1727204723.13607: _execute() done
51243 1727204723.13621: dumping result to json
51243 1727204723.13630: done dumping result, returning
51243 1727204723.13724: done running TaskExecutor() for managed-node3/TASK: TEST: wireless connection with 802.1x TLS-EAP [127b8e07-fff9-5c5d-847b-0000000000a8]
51243 1727204723.13728: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000a8
51243 1727204723.13810: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000a8
51243 1727204723.13813: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
51243 1727204723.13881: no more pending results, returning what we have
51243 1727204723.13885: results queue empty
51243 1727204723.13886: checking for any_errors_fatal
51243 1727204723.13889: done checking for any_errors_fatal
51243 1727204723.13890: checking for max_fail_percentage
51243 1727204723.13892: done checking for max_fail_percentage
51243 1727204723.13892: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.13893: done checking to see if all hosts have failed
51243 1727204723.13894: getting the remaining hosts for this loop
51243 1727204723.13896: done getting the remaining hosts for this loop
51243 1727204723.13901: getting the next task for host managed-node3
51243 1727204723.13910: done getting next task for host managed-node3
51243 1727204723.13916: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
51243 1727204723.13921: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.13943: getting variables
51243 1727204723.13946: in VariableManager get_vars()
51243 1727204723.14003: Calling all_inventory to load vars for managed-node3
51243 1727204723.14007: Calling groups_inventory to load vars for managed-node3
51243 1727204723.14010: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.14025: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.14028: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.14031: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.14735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.15395: done with get_vars()
51243 1727204723.15410: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.053) 0:00:05.764 *****
51243 1727204723.15721: entering _queue_task() for managed-node3/include_tasks
51243 1727204723.16274: worker is 1 (out of 1 available)
51243 1727204723.16290: exiting _queue_task() for managed-node3/include_tasks
51243 1727204723.16304: done queuing things up, now waiting for results queue to drain
51243 1727204723.16305: waiting for pending results...
51243 1727204723.16887: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
51243 1727204723.17119: in run() - task 127b8e07-fff9-5c5d-847b-0000000000b0
51243 1727204723.17224: variable 'ansible_search_path' from source: unknown
51243 1727204723.17229: variable 'ansible_search_path' from source: unknown
51243 1727204723.17233: calling self._execute()
51243 1727204723.17550: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.17554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.17564: variable 'omit' from source: magic vars
51243 1727204723.18458: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.18472: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.18796: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.18802: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.18806: when evaluation is False, skipping this task
51243 1727204723.18809: _execute() done
51243 1727204723.18813: dumping result to json
51243 1727204723.18816: done dumping result, returning
51243 1727204723.18946: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-5c5d-847b-0000000000b0]
51243 1727204723.18950: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b0
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204723.19159: no more pending results, returning what we have
51243 1727204723.19163: results queue empty
51243 1727204723.19164: checking for any_errors_fatal
51243 1727204723.19176: done checking for any_errors_fatal
51243 1727204723.19177: checking for max_fail_percentage
51243 1727204723.19179: done checking for max_fail_percentage
51243 1727204723.19180: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.19181: done checking to see if all hosts have failed
51243 1727204723.19182: getting the remaining hosts for this loop
51243 1727204723.19183: done getting the remaining hosts for this loop
51243 1727204723.19188: getting the next task for host managed-node3
51243 1727204723.19196: done getting next task for host managed-node3
51243 1727204723.19200: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
51243 1727204723.19203: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.19230: getting variables
51243 1727204723.19232: in VariableManager get_vars()
51243 1727204723.19346: Calling all_inventory to load vars for managed-node3
51243 1727204723.19350: Calling groups_inventory to load vars for managed-node3
51243 1727204723.19352: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.19448: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.19452: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.19456: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.20196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.20850: done with get_vars()
51243 1727204723.20975: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b0
51243 1727204723.20979: WORKER PROCESS EXITING
51243 1727204723.20980: done getting variables
51243 1727204723.21046: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.054) 0:00:05.819 *****
51243 1727204723.21187: entering _queue_task() for managed-node3/debug
51243 1727204723.21846: worker is 1 (out of 1 available)
51243 1727204723.21861: exiting _queue_task() for managed-node3/debug
51243 1727204723.22175: done queuing things up, now waiting for results queue to drain
51243 1727204723.22177: waiting for pending results...
51243 1727204723.22383: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider
51243 1727204723.22778: in run() - task 127b8e07-fff9-5c5d-847b-0000000000b1
51243 1727204723.22803: variable 'ansible_search_path' from source: unknown
51243 1727204723.22811: variable 'ansible_search_path' from source: unknown
51243 1727204723.22858: calling self._execute()
51243 1727204723.22967: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.23110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.23124: variable 'omit' from source: magic vars
51243 1727204723.23938: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.23985: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.24271: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.24315: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.24523: when evaluation is False, skipping this task
51243 1727204723.24527: _execute() done
51243 1727204723.24529: dumping result to json
51243 1727204723.24531: done dumping result, returning
51243 1727204723.24533: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-5c5d-847b-0000000000b1]
51243 1727204723.24536: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b1
51243 1727204723.24610: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b1
51243 1727204723.24613: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
51243 1727204723.24678: no more pending results, returning what we have
51243 1727204723.24682: results queue empty
51243 1727204723.24683: checking for any_errors_fatal
51243 1727204723.24690: done checking for any_errors_fatal
51243 1727204723.24691: checking for max_fail_percentage
51243 1727204723.24693: done checking for max_fail_percentage
51243 1727204723.24694: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.24695: done checking to see if all hosts have failed
51243 1727204723.24695: getting the remaining hosts for this loop
51243 1727204723.24697: done getting the remaining hosts for this loop
51243 1727204723.24702: getting the next task for host managed-node3
51243 1727204723.24709: done getting next task for host managed-node3
51243 1727204723.24713: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
51243 1727204723.24717: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.24740: getting variables
51243 1727204723.24742: in VariableManager get_vars()
51243 1727204723.24796: Calling all_inventory to load vars for managed-node3
51243 1727204723.24799: Calling groups_inventory to load vars for managed-node3
51243 1727204723.24801: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.24817: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.24819: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.24822: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.25252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.25510: done with get_vars()
51243 1727204723.25523: done getting variables
51243 1727204723.25591: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.044) 0:00:05.864 *****
51243 1727204723.25626: entering _queue_task() for managed-node3/fail
51243 1727204723.25941: worker is 1 (out of 1 available)
51243 1727204723.25957: exiting _queue_task() for managed-node3/fail
51243 1727204723.25972: done queuing things up, now waiting for results queue to drain
51243 1727204723.25974: waiting for pending results...
51243 1727204723.26388: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
51243 1727204723.26468: in run() - task 127b8e07-fff9-5c5d-847b-0000000000b2
51243 1727204723.26493: variable 'ansible_search_path' from source: unknown
51243 1727204723.26501: variable 'ansible_search_path' from source: unknown
51243 1727204723.26544: calling self._execute()
51243 1727204723.26641: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.26654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.26672: variable 'omit' from source: magic vars
51243 1727204723.27673: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.27680: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.27820: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.27833: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.27840: when evaluation is False, skipping this task
51243 1727204723.27846: _execute() done
51243 1727204723.27852: dumping result to json
51243 1727204723.27858: done dumping result, returning
51243 1727204723.27915: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-5c5d-847b-0000000000b2]
51243 1727204723.27928: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b2
51243 1727204723.28202: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b2
51243 1727204723.28206: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204723.28268: no more pending results, returning what we have
51243 1727204723.28273: results queue empty
51243 1727204723.28274: checking for any_errors_fatal
51243 1727204723.28284: done checking for any_errors_fatal
51243 1727204723.28285: checking for max_fail_percentage
51243 1727204723.28287: done checking for max_fail_percentage
51243 1727204723.28288: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.28290: done checking to see if all hosts have failed
51243 1727204723.28290: getting the remaining hosts for this loop
51243 1727204723.28292: done getting the remaining hosts for this loop
51243 1727204723.28298: getting the next task for host managed-node3
51243 1727204723.28306: done getting next task for host managed-node3
51243 1727204723.28311: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
51243 1727204723.28315: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.28341: getting variables
51243 1727204723.28343: in VariableManager get_vars()
51243 1727204723.28524: Calling all_inventory to load vars for managed-node3
51243 1727204723.28528: Calling groups_inventory to load vars for managed-node3
51243 1727204723.28531: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.28547: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.28551: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.28555: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.29033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.29289: done with get_vars()
51243 1727204723.29303: done getting variables
51243 1727204723.29375: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.037) 0:00:05.901 *****
51243 1727204723.29413: entering _queue_task() for managed-node3/fail
51243 1727204723.29769: worker is 1 (out of 1 available)
51243 1727204723.29878: exiting _queue_task() for managed-node3/fail
51243 1727204723.29890: done queuing things up, now waiting for results queue to drain
51243 1727204723.29892: waiting for pending results...
51243 1727204723.30187: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
51243 1727204723.30267: in run() - task 127b8e07-fff9-5c5d-847b-0000000000b3
51243 1727204723.30288: variable 'ansible_search_path' from source: unknown
51243 1727204723.30295: variable 'ansible_search_path' from source: unknown
51243 1727204723.30371: calling self._execute()
51243 1727204723.30440: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.30452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.30465: variable 'omit' from source: magic vars
51243 1727204723.31375: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.31379: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.31576: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.31620: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.31628: when evaluation is False, skipping this task
51243 1727204723.31635: _execute() done
51243 1727204723.31643: dumping result to json
51243 1727204723.31718: done dumping result, returning
51243 1727204723.31723: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-5c5d-847b-0000000000b3]
51243 1727204723.31725: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b3
51243 1727204723.32295: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b3
51243 1727204723.32299: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204723.32344: no more pending results, returning what we have
51243 1727204723.32347: results queue empty
51243 1727204723.32348: checking for any_errors_fatal
51243 1727204723.32353: done checking for any_errors_fatal
51243 1727204723.32354: checking for max_fail_percentage
51243 1727204723.32356: done checking for max_fail_percentage
51243 1727204723.32357: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.32358: done checking to see if all hosts have failed
51243 1727204723.32359: getting the remaining hosts for this loop
51243 1727204723.32360: done getting the remaining hosts for this loop
51243 1727204723.32364: getting the next task for host managed-node3
51243 1727204723.32371: done getting next task for host managed-node3
51243 1727204723.32375: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
51243 1727204723.32378: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.32397: getting variables
51243 1727204723.32399: in VariableManager get_vars()
51243 1727204723.32446: Calling all_inventory to load vars for managed-node3
51243 1727204723.32449: Calling groups_inventory to load vars for managed-node3
51243 1727204723.32452: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.32462: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.32668: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.32677: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.33092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.33558: done with get_vars()
51243 1727204723.33575: done getting variables
51243 1727204723.33639: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.044) 0:00:05.946 *****
51243 1727204723.33882: entering _queue_task() for managed-node3/fail
51243 1727204723.34442: worker is 1 (out of 1 available)
51243 1727204723.34459: exiting _queue_task() for managed-node3/fail
51243 1727204723.34675: done queuing things up, now waiting for results queue to drain
51243 1727204723.34677: waiting for pending results...
51243 1727204723.35197: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
51243 1727204723.35326: in run() - task 127b8e07-fff9-5c5d-847b-0000000000b4
51243 1727204723.35345: variable 'ansible_search_path' from source: unknown
51243 1727204723.35349: variable 'ansible_search_path' from source: unknown
51243 1727204723.35506: calling self._execute()
51243 1727204723.35629: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.35637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.35650: variable 'omit' from source: magic vars
51243 1727204723.36536: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.36540: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.37017: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.37072: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.37077: when evaluation is False, skipping this task
51243 1727204723.37081: _execute() done
51243 1727204723.37084: dumping result to json
51243 1727204723.37086: done dumping result, returning
51243 1727204723.37090: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-5c5d-847b-0000000000b4]
51243 1727204723.37093: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b4
51243 1727204723.37342: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b4
51243 1727204723.37345: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204723.37391: no more pending results, returning what we have
51243 1727204723.37394: results queue empty
51243 1727204723.37395: checking for any_errors_fatal
51243 1727204723.37401: done checking for any_errors_fatal
51243 1727204723.37402: checking for max_fail_percentage
51243 1727204723.37403: done checking for max_fail_percentage
51243 1727204723.37404: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.37405: done checking to see if all hosts have failed
51243 1727204723.37406: getting the remaining hosts for this loop
51243 1727204723.37407: done getting the remaining hosts for this loop
51243 1727204723.37410: getting the next task for host managed-node3
51243 1727204723.37416: done getting next task for host managed-node3
51243 1727204723.37419: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
51243 1727204723.37422: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.37440: getting variables
51243 1727204723.37441: in VariableManager get_vars()
51243 1727204723.37489: Calling all_inventory to load vars for managed-node3
51243 1727204723.37492: Calling groups_inventory to load vars for managed-node3
51243 1727204723.37494: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.37504: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.37506: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.37509: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.37964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.38619: done with get_vars()
51243 1727204723.38633: done getting variables
51243 1727204723.38701: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.048) 0:00:05.995 *****
51243 1727204723.38735: entering _queue_task() for managed-node3/dnf
51243 1727204723.39400: worker is 1 (out of 1 available)
51243 1727204723.39417: exiting _queue_task() for managed-node3/dnf
51243 1727204723.39431: done queuing things up, now waiting for results queue to drain
51243 1727204723.39433: waiting for pending results...
51243 1727204723.40117: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 51243 1727204723.40324: in run() - task 127b8e07-fff9-5c5d-847b-0000000000b5 51243 1727204723.40344: variable 'ansible_search_path' from source: unknown 51243 1727204723.40349: variable 'ansible_search_path' from source: unknown 51243 1727204723.40517: calling self._execute() 51243 1727204723.40735: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204723.40744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204723.40761: variable 'omit' from source: magic vars 51243 1727204723.41656: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.41771: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204723.42053: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.42059: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204723.42062: when evaluation is False, skipping this task 51243 1727204723.42067: _execute() done 51243 1727204723.42070: dumping result to json 51243 1727204723.42073: done dumping result, returning 51243 1727204723.42083: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-0000000000b5] 51243 1727204723.42089: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b5 51243 1727204723.42373: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b5 51243 1727204723.42376: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 51243 1727204723.42435: no more pending results, returning what we have 51243 1727204723.42438: results queue empty 51243 1727204723.42440: checking for any_errors_fatal 51243 1727204723.42446: done checking for any_errors_fatal 51243 1727204723.42447: checking for max_fail_percentage 51243 1727204723.42448: done checking for max_fail_percentage 51243 1727204723.42449: checking to see if all hosts have failed and the running result is not ok 51243 1727204723.42450: done checking to see if all hosts have failed 51243 1727204723.42451: getting the remaining hosts for this loop 51243 1727204723.42453: done getting the remaining hosts for this loop 51243 1727204723.42457: getting the next task for host managed-node3 51243 1727204723.42463: done getting next task for host managed-node3 51243 1727204723.42469: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 51243 1727204723.42473: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204723.42495: getting variables 51243 1727204723.42497: in VariableManager get_vars() 51243 1727204723.42547: Calling all_inventory to load vars for managed-node3 51243 1727204723.42550: Calling groups_inventory to load vars for managed-node3 51243 1727204723.42552: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204723.42563: Calling all_plugins_play to load vars for managed-node3 51243 1727204723.42667: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204723.42674: Calling groups_plugins_play to load vars for managed-node3 51243 1727204723.43084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204723.43573: done with get_vars() 51243 1727204723.43587: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 51243 1727204723.43873: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.051) 0:00:06.046 ***** 51243 1727204723.43906: entering _queue_task() for managed-node3/yum 51243 1727204723.44458: worker is 1 (out of 1 available) 51243 1727204723.44727: exiting _queue_task() for managed-node3/yum 51243 1727204723.44740: done queuing things up, now waiting for results queue to drain 51243 1727204723.44742: waiting for pending results... 
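The entry `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` above explains why a YUM task loads `dnf.py`: on dnf-based distributions, collection routing redirects the builtin `yum` action to the `dnf` implementation. A hedged sketch of a task that would trigger this redirect (argument values are illustrative):

```yaml
# Hypothetical sketch: on a dnf-based host this task is serviced by the
# dnf action plugin via the redirect logged above.
- name: Check if updates for network packages are available through the YUM package manager
  ansible.builtin.yum:          # redirected to ansible.builtin.dnf at load time
    name: NetworkManager        # illustrative package
    state: latest
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version == '7'
```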
51243 1727204723.45000: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 51243 1727204723.45315: in run() - task 127b8e07-fff9-5c5d-847b-0000000000b6 51243 1727204723.45395: variable 'ansible_search_path' from source: unknown 51243 1727204723.45399: variable 'ansible_search_path' from source: unknown 51243 1727204723.45502: calling self._execute() 51243 1727204723.45721: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204723.45727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204723.45741: variable 'omit' from source: magic vars 51243 1727204723.46608: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.46771: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204723.46985: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.46992: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204723.46995: when evaluation is False, skipping this task 51243 1727204723.46998: _execute() done 51243 1727204723.47001: dumping result to json 51243 1727204723.47004: done dumping result, returning 51243 1727204723.47017: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-0000000000b6] 51243 1727204723.47023: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b6 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204723.47352: no more pending results, returning what we have 51243 1727204723.47356: results queue empty 51243 
1727204723.47357: checking for any_errors_fatal 51243 1727204723.47369: done checking for any_errors_fatal 51243 1727204723.47370: checking for max_fail_percentage 51243 1727204723.47371: done checking for max_fail_percentage 51243 1727204723.47373: checking to see if all hosts have failed and the running result is not ok 51243 1727204723.47373: done checking to see if all hosts have failed 51243 1727204723.47374: getting the remaining hosts for this loop 51243 1727204723.47376: done getting the remaining hosts for this loop 51243 1727204723.47380: getting the next task for host managed-node3 51243 1727204723.47388: done getting next task for host managed-node3 51243 1727204723.47391: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 51243 1727204723.47395: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204723.47416: getting variables 51243 1727204723.47418: in VariableManager get_vars() 51243 1727204723.47675: Calling all_inventory to load vars for managed-node3 51243 1727204723.47678: Calling groups_inventory to load vars for managed-node3 51243 1727204723.47680: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204723.47690: Calling all_plugins_play to load vars for managed-node3 51243 1727204723.47693: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204723.47696: Calling groups_plugins_play to load vars for managed-node3 51243 1727204723.48012: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b6 51243 1727204723.48016: WORKER PROCESS EXITING 51243 1727204723.48041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204723.48569: done with get_vars() 51243 1727204723.48582: done getting variables 51243 1727204723.48639: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.049) 0:00:06.096 ***** 51243 1727204723.48877: entering _queue_task() for managed-node3/fail 51243 1727204723.49332: worker is 1 (out of 1 available) 51243 1727204723.49348: exiting _queue_task() for managed-node3/fail 51243 1727204723.49362: done queuing things up, now waiting for results queue to drain 51243 1727204723.49364: waiting for pending results... 
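For the consent task above, the executor loads `ActionModule 'fail'`, i.e. the task aborts the play with a message when its conditions hold. A minimal sketch of such a guard task, with an illustrative message (the role's actual wording is not shown in this log):

```yaml
# Hypothetical sketch: msg text is illustrative, not the role's actual message.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: NetworkManager must be restarted; re-run after granting consent
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version == '7'
```

Here, as with the package tasks, the second condition evaluates false on this host, so the `fail` action never runs and the task is skipped.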
51243 1727204723.49931: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 51243 1727204723.50185: in run() - task 127b8e07-fff9-5c5d-847b-0000000000b7 51243 1727204723.50201: variable 'ansible_search_path' from source: unknown 51243 1727204723.50204: variable 'ansible_search_path' from source: unknown 51243 1727204723.50364: calling self._execute() 51243 1727204723.50541: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204723.50664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204723.50676: variable 'omit' from source: magic vars 51243 1727204723.51563: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.51649: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204723.51905: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.51909: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204723.51912: when evaluation is False, skipping this task 51243 1727204723.51915: _execute() done 51243 1727204723.51920: dumping result to json 51243 1727204723.51923: done dumping result, returning 51243 1727204723.51933: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-0000000000b7] 51243 1727204723.51943: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b7 51243 1727204723.52173: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b7 51243 1727204723.52176: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204723.52428: no more pending results, returning what we have 
51243 1727204723.52432: results queue empty 51243 1727204723.52433: checking for any_errors_fatal 51243 1727204723.52442: done checking for any_errors_fatal 51243 1727204723.52444: checking for max_fail_percentage 51243 1727204723.52446: done checking for max_fail_percentage 51243 1727204723.52447: checking to see if all hosts have failed and the running result is not ok 51243 1727204723.52448: done checking to see if all hosts have failed 51243 1727204723.52449: getting the remaining hosts for this loop 51243 1727204723.52451: done getting the remaining hosts for this loop 51243 1727204723.52456: getting the next task for host managed-node3 51243 1727204723.52464: done getting next task for host managed-node3 51243 1727204723.52470: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 51243 1727204723.52474: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204723.52494: getting variables 51243 1727204723.52495: in VariableManager get_vars() 51243 1727204723.52543: Calling all_inventory to load vars for managed-node3 51243 1727204723.52545: Calling groups_inventory to load vars for managed-node3 51243 1727204723.52547: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204723.52557: Calling all_plugins_play to load vars for managed-node3 51243 1727204723.52561: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204723.52564: Calling groups_plugins_play to load vars for managed-node3 51243 1727204723.53129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204723.53892: done with get_vars() 51243 1727204723.53906: done getting variables 51243 1727204723.54178: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.053) 0:00:06.149 ***** 51243 1727204723.54216: entering _queue_task() for managed-node3/package 51243 1727204723.54763: worker is 1 (out of 1 available) 51243 1727204723.55081: exiting _queue_task() for managed-node3/package 51243 1727204723.55093: done queuing things up, now waiting for results queue to drain 51243 1727204723.55095: waiting for pending results... 
51243 1727204723.55688: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 51243 1727204723.55697: in run() - task 127b8e07-fff9-5c5d-847b-0000000000b8 51243 1727204723.55702: variable 'ansible_search_path' from source: unknown 51243 1727204723.55704: variable 'ansible_search_path' from source: unknown 51243 1727204723.55916: calling self._execute() 51243 1727204723.56024: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204723.56028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204723.56242: variable 'omit' from source: magic vars 51243 1727204723.57581: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.57594: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204723.57744: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.58084: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204723.58088: when evaluation is False, skipping this task 51243 1727204723.58091: _execute() done 51243 1727204723.58094: dumping result to json 51243 1727204723.58096: done dumping result, returning 51243 1727204723.58105: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-5c5d-847b-0000000000b8] 51243 1727204723.58112: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b8 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204723.58405: no more pending results, returning what we have 51243 1727204723.58409: results queue empty 51243 1727204723.58410: checking for any_errors_fatal 51243 1727204723.58422: done checking for any_errors_fatal 51243 1727204723.58423: checking for max_fail_percentage 51243 1727204723.58425: done checking for 
max_fail_percentage 51243 1727204723.58426: checking to see if all hosts have failed and the running result is not ok 51243 1727204723.58428: done checking to see if all hosts have failed 51243 1727204723.58429: getting the remaining hosts for this loop 51243 1727204723.58431: done getting the remaining hosts for this loop 51243 1727204723.58435: getting the next task for host managed-node3 51243 1727204723.58443: done getting next task for host managed-node3 51243 1727204723.58448: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 51243 1727204723.58451: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204723.58478: getting variables 51243 1727204723.58480: in VariableManager get_vars() 51243 1727204723.58536: Calling all_inventory to load vars for managed-node3 51243 1727204723.58540: Calling groups_inventory to load vars for managed-node3 51243 1727204723.58542: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204723.58556: Calling all_plugins_play to load vars for managed-node3 51243 1727204723.58561: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204723.58565: Calling groups_plugins_play to load vars for managed-node3 51243 1727204723.58880: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b8 51243 1727204723.58884: WORKER PROCESS EXITING 51243 1727204723.59322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204723.59829: done with get_vars() 51243 1727204723.59845: done getting variables 51243 1727204723.59908: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.057) 0:00:06.207 ***** 51243 1727204723.59943: entering _queue_task() for managed-node3/package 51243 1727204723.60887: worker is 1 (out of 1 available) 51243 1727204723.60901: exiting _queue_task() for managed-node3/package 51243 1727204723.60915: done queuing things up, now waiting for results queue to drain 51243 1727204723.60917: waiting for pending results... 
51243 1727204723.61457: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 51243 1727204723.61945: in run() - task 127b8e07-fff9-5c5d-847b-0000000000b9 51243 1727204723.62014: variable 'ansible_search_path' from source: unknown 51243 1727204723.62025: variable 'ansible_search_path' from source: unknown 51243 1727204723.62210: calling self._execute() 51243 1727204723.62423: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204723.62442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204723.62485: variable 'omit' from source: magic vars 51243 1727204723.63553: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.63558: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204723.63823: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.63848: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204723.63878: when evaluation is False, skipping this task 51243 1727204723.63888: _execute() done 51243 1727204723.63899: dumping result to json 51243 1727204723.64010: done dumping result, returning 51243 1727204723.64014: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-5c5d-847b-0000000000b9] 51243 1727204723.64024: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b9 51243 1727204723.64216: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000b9 51243 1727204723.64219: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204723.64295: no more pending results, returning what we have 51243 1727204723.64299: 
results queue empty 51243 1727204723.64300: checking for any_errors_fatal 51243 1727204723.64308: done checking for any_errors_fatal 51243 1727204723.64309: checking for max_fail_percentage 51243 1727204723.64311: done checking for max_fail_percentage 51243 1727204723.64312: checking to see if all hosts have failed and the running result is not ok 51243 1727204723.64313: done checking to see if all hosts have failed 51243 1727204723.64314: getting the remaining hosts for this loop 51243 1727204723.64316: done getting the remaining hosts for this loop 51243 1727204723.64320: getting the next task for host managed-node3 51243 1727204723.64328: done getting next task for host managed-node3 51243 1727204723.64332: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 51243 1727204723.64336: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204723.64359: getting variables 51243 1727204723.64361: in VariableManager get_vars() 51243 1727204723.64620: Calling all_inventory to load vars for managed-node3 51243 1727204723.64623: Calling groups_inventory to load vars for managed-node3 51243 1727204723.64625: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204723.64637: Calling all_plugins_play to load vars for managed-node3 51243 1727204723.64640: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204723.64643: Calling groups_plugins_play to load vars for managed-node3 51243 1727204723.65228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204723.65832: done with get_vars() 51243 1727204723.65847: done getting variables 51243 1727204723.66132: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.062) 0:00:06.269 ***** 51243 1727204723.66173: entering _queue_task() for managed-node3/package 51243 1727204723.66551: worker is 1 (out of 1 available) 51243 1727204723.66848: exiting _queue_task() for managed-node3/package 51243 1727204723.66861: done queuing things up, now waiting for results queue to drain 51243 1727204723.66862: waiting for pending results... 
51243 1727204723.67189: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 51243 1727204723.67549: in run() - task 127b8e07-fff9-5c5d-847b-0000000000ba 51243 1727204723.67554: variable 'ansible_search_path' from source: unknown 51243 1727204723.67556: variable 'ansible_search_path' from source: unknown 51243 1727204723.67787: calling self._execute() 51243 1727204723.68012: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204723.68039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204723.68085: variable 'omit' from source: magic vars 51243 1727204723.68693: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.68713: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204723.68937: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.68949: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204723.68957: when evaluation is False, skipping this task 51243 1727204723.68968: _execute() done 51243 1727204723.69028: dumping result to json 51243 1727204723.69037: done dumping result, returning 51243 1727204723.69040: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-5c5d-847b-0000000000ba] 51243 1727204723.69043: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000ba 51243 1727204723.69406: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000ba 51243 1727204723.69410: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204723.69611: no more pending results, returning what we have 51243 1727204723.69614: results queue 
empty 51243 1727204723.69616: checking for any_errors_fatal 51243 1727204723.69624: done checking for any_errors_fatal 51243 1727204723.69624: checking for max_fail_percentage 51243 1727204723.69626: done checking for max_fail_percentage 51243 1727204723.69627: checking to see if all hosts have failed and the running result is not ok 51243 1727204723.69628: done checking to see if all hosts have failed 51243 1727204723.69629: getting the remaining hosts for this loop 51243 1727204723.69631: done getting the remaining hosts for this loop 51243 1727204723.69635: getting the next task for host managed-node3 51243 1727204723.69641: done getting next task for host managed-node3 51243 1727204723.69644: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 51243 1727204723.69647: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204723.69668: getting variables 51243 1727204723.69669: in VariableManager get_vars() 51243 1727204723.69715: Calling all_inventory to load vars for managed-node3 51243 1727204723.69718: Calling groups_inventory to load vars for managed-node3 51243 1727204723.69720: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204723.69731: Calling all_plugins_play to load vars for managed-node3 51243 1727204723.69733: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204723.69737: Calling groups_plugins_play to load vars for managed-node3 51243 1727204723.70328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204723.70822: done with get_vars() 51243 1727204723.70835: done getting variables 51243 1727204723.70923: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:23 -0400 (0:00:00.047) 0:00:06.317 ***** 51243 1727204723.70957: entering _queue_task() for managed-node3/service 51243 1727204723.72404: worker is 1 (out of 1 available) 51243 1727204723.72417: exiting _queue_task() for managed-node3/service 51243 1727204723.72427: done queuing things up, now waiting for results queue to drain 51243 1727204723.72429: waiting for pending results... 
51243 1727204723.72793: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 51243 1727204723.73269: in run() - task 127b8e07-fff9-5c5d-847b-0000000000bb 51243 1727204723.73274: variable 'ansible_search_path' from source: unknown 51243 1727204723.73278: variable 'ansible_search_path' from source: unknown 51243 1727204723.73714: calling self._execute() 51243 1727204723.73736: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204723.73754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204723.73768: variable 'omit' from source: magic vars 51243 1727204723.74871: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.75001: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204723.75348: variable 'ansible_distribution_major_version' from source: facts 51243 1727204723.75355: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204723.75358: when evaluation is False, skipping this task 51243 1727204723.75361: _execute() done 51243 1727204723.75364: dumping result to json 51243 1727204723.75369: done dumping result, returning 51243 1727204723.75475: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-0000000000bb] 51243 1727204723.75479: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000bb 51243 1727204723.75563: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000bb 51243 1727204723.75568: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204723.75621: no more pending results, returning what we have 51243 1727204723.75625: results queue empty 
51243 1727204723.75626: checking for any_errors_fatal
51243 1727204723.75634: done checking for any_errors_fatal
51243 1727204723.75635: checking for max_fail_percentage
51243 1727204723.75637: done checking for max_fail_percentage
51243 1727204723.75638: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.75639: done checking to see if all hosts have failed
51243 1727204723.75641: getting the remaining hosts for this loop
51243 1727204723.75643: done getting the remaining hosts for this loop
51243 1727204723.75648: getting the next task for host managed-node3
51243 1727204723.75657: done getting next task for host managed-node3
51243 1727204723.75661: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
51243 1727204723.75667: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.75695: getting variables
51243 1727204723.75697: in VariableManager get_vars()
51243 1727204723.75755: Calling all_inventory to load vars for managed-node3
51243 1727204723.75758: Calling groups_inventory to load vars for managed-node3
51243 1727204723.75760: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.75995: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.75999: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.76004: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.77103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.78081: done with get_vars()
51243 1727204723.78098: done getting variables
51243 1727204723.78164: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024  15:05:23 -0400 (0:00:00.076)       0:00:06.393 *****
51243 1727204723.78612: entering _queue_task() for managed-node3/service
51243 1727204723.79603: worker is 1 (out of 1 available)
51243 1727204723.79618: exiting _queue_task() for managed-node3/service
51243 1727204723.79630: done queuing things up, now waiting for results queue to drain
51243 1727204723.79632: waiting for pending results...
51243 1727204723.80198: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
51243 1727204723.80516: in run() - task 127b8e07-fff9-5c5d-847b-0000000000bc
51243 1727204723.80522: variable 'ansible_search_path' from source: unknown
51243 1727204723.80526: variable 'ansible_search_path' from source: unknown
51243 1727204723.80532: calling self._execute()
51243 1727204723.80683: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.80695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.80709: variable 'omit' from source: magic vars
51243 1727204723.81124: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.81146: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.81281: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.81372: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.81375: when evaluation is False, skipping this task
51243 1727204723.81385: _execute() done
51243 1727204723.81388: dumping result to json
51243 1727204723.81391: done dumping result, returning
51243 1727204723.81394: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-5c5d-847b-0000000000bc]
51243 1727204723.81396: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000bc
51243 1727204723.81578: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000bc
51243 1727204723.81581: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
51243 1727204723.81638: no more pending results, returning what we have
51243 1727204723.81643: results queue empty
51243 1727204723.81644: checking for any_errors_fatal
51243 1727204723.81653: done checking for any_errors_fatal
51243 1727204723.81654: checking for max_fail_percentage
51243 1727204723.81656: done checking for max_fail_percentage
51243 1727204723.81658: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.81659: done checking to see if all hosts have failed
51243 1727204723.81660: getting the remaining hosts for this loop
51243 1727204723.81662: done getting the remaining hosts for this loop
51243 1727204723.81670: getting the next task for host managed-node3
51243 1727204723.81678: done getting next task for host managed-node3
51243 1727204723.81684: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
51243 1727204723.81688: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.81711: getting variables
51243 1727204723.81713: in VariableManager get_vars()
51243 1727204723.81872: Calling all_inventory to load vars for managed-node3
51243 1727204723.81875: Calling groups_inventory to load vars for managed-node3
51243 1727204723.81878: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.81889: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.81891: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.81894: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.82236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.82523: done with get_vars()
51243 1727204723.82538: done getting variables
51243 1727204723.82606: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Tuesday 24 September 2024  15:05:23 -0400 (0:00:00.040)       0:00:06.434 *****
51243 1727204723.82643: entering _queue_task() for managed-node3/service
51243 1727204723.83063: worker is 1 (out of 1 available)
51243 1727204723.83151: exiting _queue_task() for managed-node3/service
51243 1727204723.83164: done queuing things up, now waiting for results queue to drain
51243 1727204723.83270: waiting for pending results...
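The censored result for "Enable and start NetworkManager" above is what Ansible emits when a task result carries `no_log`: the result payload is replaced with the "censored" message and only `changed` survives in the output. A hedged sketch of a task shape that would produce this — the module arguments are illustrative, only the `no_log: true` keyword is confirmed by the log message itself:

```yaml
# Illustrative sketch; the log's "censored" message confirms no_log: true
# was set on this task, the service arguments here are assumptions.
- name: Enable and start NetworkManager
  service:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true  # hides the result payload, even for skipped tasks
```

Note that `no_log` censoring applies even when the task is skipped, which is why the usual `false_condition` / `skip_reason` fields are absent from that result.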
51243 1727204723.83486: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
51243 1727204723.83562: in run() - task 127b8e07-fff9-5c5d-847b-0000000000bd
51243 1727204723.83591: variable 'ansible_search_path' from source: unknown
51243 1727204723.83607: variable 'ansible_search_path' from source: unknown
51243 1727204723.83657: calling self._execute()
51243 1727204723.83764: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.83782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.83801: variable 'omit' from source: magic vars
51243 1727204723.84260: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.84264: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.84376: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.84388: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.84394: when evaluation is False, skipping this task
51243 1727204723.84401: _execute() done
51243 1727204723.84409: dumping result to json
51243 1727204723.84418: done dumping result, returning
51243 1727204723.84433: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-5c5d-847b-0000000000bd]
51243 1727204723.84473: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000bd
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204723.84721: no more pending results, returning what we have
51243 1727204723.84725: results queue empty
51243 1727204723.84726: checking for any_errors_fatal
51243 1727204723.84734: done checking for any_errors_fatal
51243 1727204723.84735: checking for max_fail_percentage
51243 1727204723.84737: done checking for max_fail_percentage
51243 1727204723.84738: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.84739: done checking to see if all hosts have failed
51243 1727204723.84739: getting the remaining hosts for this loop
51243 1727204723.84741: done getting the remaining hosts for this loop
51243 1727204723.84746: getting the next task for host managed-node3
51243 1727204723.84754: done getting next task for host managed-node3
51243 1727204723.84758: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
51243 1727204723.84762: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.84878: getting variables
51243 1727204723.84881: in VariableManager get_vars()
51243 1727204723.85088: Calling all_inventory to load vars for managed-node3
51243 1727204723.85091: Calling groups_inventory to load vars for managed-node3
51243 1727204723.85094: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.85105: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.85107: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.85111: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.85483: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000bd
51243 1727204723.85487: WORKER PROCESS EXITING
51243 1727204723.85511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.85775: done with get_vars()
51243 1727204723.85790: done getting variables
51243 1727204723.85852: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Tuesday 24 September 2024  15:05:23 -0400 (0:00:00.032)       0:00:06.466 *****
51243 1727204723.85891: entering _queue_task() for managed-node3/service
51243 1727204723.86219: worker is 1 (out of 1 available)
51243 1727204723.86234: exiting _queue_task() for managed-node3/service
51243 1727204723.86248: done queuing things up, now waiting for results queue to drain
51243 1727204723.86249: waiting for pending results...
51243 1727204723.86586: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service
51243 1727204723.86825: in run() - task 127b8e07-fff9-5c5d-847b-0000000000be
51243 1727204723.86848: variable 'ansible_search_path' from source: unknown
51243 1727204723.86856: variable 'ansible_search_path' from source: unknown
51243 1727204723.86905: calling self._execute()
51243 1727204723.87002: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.87015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.87033: variable 'omit' from source: magic vars
51243 1727204723.87457: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.87482: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.87611: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.87624: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.87632: when evaluation is False, skipping this task
51243 1727204723.87671: _execute() done
51243 1727204723.87675: dumping result to json
51243 1727204723.87682: done dumping result, returning
51243 1727204723.87685: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-5c5d-847b-0000000000be]
51243 1727204723.87688: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000be
skipping: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
51243 1727204723.87840: no more pending results, returning what we have
51243 1727204723.87845: results queue empty
51243 1727204723.87847: checking for any_errors_fatal
51243 1727204723.87857: done checking for any_errors_fatal
51243 1727204723.87858: checking for max_fail_percentage
51243 1727204723.87861: done checking for max_fail_percentage
51243 1727204723.87862: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.87863: done checking to see if all hosts have failed
51243 1727204723.87864: getting the remaining hosts for this loop
51243 1727204723.87867: done getting the remaining hosts for this loop
51243 1727204723.87872: getting the next task for host managed-node3
51243 1727204723.87879: done getting next task for host managed-node3
51243 1727204723.87884: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
51243 1727204723.87887: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.87911: getting variables
51243 1727204723.87912: in VariableManager get_vars()
51243 1727204723.88258: Calling all_inventory to load vars for managed-node3
51243 1727204723.88261: Calling groups_inventory to load vars for managed-node3
51243 1727204723.88265: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.88273: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000be
51243 1727204723.88276: WORKER PROCESS EXITING
51243 1727204723.88286: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.88289: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.88293: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.88547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.88805: done with get_vars()
51243 1727204723.88819: done getting variables
51243 1727204723.88886: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Tuesday 24 September 2024  15:05:23 -0400 (0:00:00.030)       0:00:06.496 *****
51243 1727204723.88921: entering _queue_task() for managed-node3/copy
51243 1727204723.89393: worker is 1 (out of 1 available)
51243 1727204723.89406: exiting _queue_task() for managed-node3/copy
51243 1727204723.89424: done queuing things up, now waiting for results queue to drain
51243 1727204723.89446: waiting for pending results...
51243 1727204723.89672: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
51243 1727204723.89697: in run() - task 127b8e07-fff9-5c5d-847b-0000000000bf
51243 1727204723.89723: variable 'ansible_search_path' from source: unknown
51243 1727204723.89727: variable 'ansible_search_path' from source: unknown
51243 1727204723.89775: calling self._execute()
51243 1727204723.89887: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.90072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.90078: variable 'omit' from source: magic vars
51243 1727204723.90406: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.90421: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.90593: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.90600: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.90603: when evaluation is False, skipping this task
51243 1727204723.90606: _execute() done
51243 1727204723.90608: dumping result to json
51243 1727204723.90614: done dumping result, returning
51243 1727204723.90622: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-5c5d-847b-0000000000bf]
51243 1727204723.90627: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000bf
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204723.90897: no more pending results, returning what we have
51243 1727204723.90901: results queue empty
51243 1727204723.90902: checking for any_errors_fatal
51243 1727204723.90907: done checking for any_errors_fatal
51243 1727204723.90908: checking for max_fail_percentage
51243 1727204723.90910: done checking for max_fail_percentage
51243 1727204723.90910: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.90911: done checking to see if all hosts have failed
51243 1727204723.90912: getting the remaining hosts for this loop
51243 1727204723.90914: done getting the remaining hosts for this loop
51243 1727204723.90917: getting the next task for host managed-node3
51243 1727204723.90923: done getting next task for host managed-node3
51243 1727204723.90926: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
51243 1727204723.90929: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.90949: getting variables
51243 1727204723.90951: in VariableManager get_vars()
51243 1727204723.90999: Calling all_inventory to load vars for managed-node3
51243 1727204723.91001: Calling groups_inventory to load vars for managed-node3
51243 1727204723.91011: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.91017: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000bf
51243 1727204723.91019: WORKER PROCESS EXITING
51243 1727204723.91028: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.91031: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.91036: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.91253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.91496: done with get_vars()
51243 1727204723.91509: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024  15:05:23 -0400 (0:00:00.026)       0:00:06.523 *****
51243 1727204723.91607: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections
51243 1727204723.91951: worker is 1 (out of 1 available)
51243 1727204723.91967: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections
51243 1727204723.92173: done queuing things up, now waiting for results queue to drain
51243 1727204723.92176: waiting for pending results...
51243 1727204723.92272: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
51243 1727204723.92524: in run() - task 127b8e07-fff9-5c5d-847b-0000000000c0
51243 1727204723.92597: variable 'ansible_search_path' from source: unknown
51243 1727204723.92610: variable 'ansible_search_path' from source: unknown
51243 1727204723.92660: calling self._execute()
51243 1727204723.92775: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.92787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.92806: variable 'omit' from source: magic vars
51243 1727204723.93705: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.93739: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.93850: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.93854: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.93858: when evaluation is False, skipping this task
51243 1727204723.93861: _execute() done
51243 1727204723.93863: dumping result to json
51243 1727204723.93960: done dumping result, returning
51243 1727204723.93964: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-5c5d-847b-0000000000c0]
51243 1727204723.93969: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000c0
51243 1727204723.94043: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000c0
51243 1727204723.94046: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204723.94143: no more pending results, returning what we have
51243 1727204723.94146: results queue empty
51243 1727204723.94146: checking for any_errors_fatal
51243 1727204723.94154: done checking for any_errors_fatal
51243 1727204723.94155: checking for max_fail_percentage
51243 1727204723.94156: done checking for max_fail_percentage
51243 1727204723.94157: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.94158: done checking to see if all hosts have failed
51243 1727204723.94158: getting the remaining hosts for this loop
51243 1727204723.94160: done getting the remaining hosts for this loop
51243 1727204723.94163: getting the next task for host managed-node3
51243 1727204723.94172: done getting next task for host managed-node3
51243 1727204723.94176: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
51243 1727204723.94180: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.94197: getting variables
51243 1727204723.94199: in VariableManager get_vars()
51243 1727204723.94245: Calling all_inventory to load vars for managed-node3
51243 1727204723.94248: Calling groups_inventory to load vars for managed-node3
51243 1727204723.94250: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.94259: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.94262: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.94291: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.94822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.95175: done with get_vars()
51243 1727204723.95200: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Tuesday 24 September 2024  15:05:23 -0400 (0:00:00.036)       0:00:06.560 *****
51243 1727204723.95305: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state
51243 1727204723.95574: worker is 1 (out of 1 available)
51243 1727204723.95588: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state
51243 1727204723.95602: done queuing things up, now waiting for results queue to drain
51243 1727204723.95603: waiting for pending results...
51243 1727204723.95786: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state
51243 1727204723.95896: in run() - task 127b8e07-fff9-5c5d-847b-0000000000c1
51243 1727204723.95907: variable 'ansible_search_path' from source: unknown
51243 1727204723.95911: variable 'ansible_search_path' from source: unknown
51243 1727204723.95945: calling self._execute()
51243 1727204723.96017: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.96022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.96031: variable 'omit' from source: magic vars
51243 1727204723.96340: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.96349: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.96440: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.96443: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.96446: when evaluation is False, skipping this task
51243 1727204723.96449: _execute() done
51243 1727204723.96451: dumping result to json
51243 1727204723.96455: done dumping result, returning
51243 1727204723.96463: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-5c5d-847b-0000000000c1]
51243 1727204723.96471: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000c1
51243 1727204723.96572: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000c1
51243 1727204723.96575: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
51243 1727204723.96641: no more pending results, returning what we have
51243 1727204723.96645: results queue empty
51243 1727204723.96646: checking for any_errors_fatal
51243 1727204723.96656: done checking for any_errors_fatal
51243 1727204723.96657: checking for max_fail_percentage
51243 1727204723.96658: done checking for max_fail_percentage
51243 1727204723.96659: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.96660: done checking to see if all hosts have failed
51243 1727204723.96661: getting the remaining hosts for this loop
51243 1727204723.96662: done getting the remaining hosts for this loop
51243 1727204723.96670: getting the next task for host managed-node3
51243 1727204723.96677: done getting next task for host managed-node3
51243 1727204723.96681: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
51243 1727204723.96684: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.96703: getting variables
51243 1727204723.96704: in VariableManager get_vars()
51243 1727204723.96750: Calling all_inventory to load vars for managed-node3
51243 1727204723.96752: Calling groups_inventory to load vars for managed-node3
51243 1727204723.96755: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.96764: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.96774: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.96778: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.96915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.97091: done with get_vars()
51243 1727204723.97101: done getting variables
51243 1727204723.97148: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Tuesday 24 September 2024  15:05:23 -0400 (0:00:00.018)       0:00:06.579 *****
51243 1727204723.97174: entering _queue_task() for managed-node3/debug
51243 1727204723.97479: worker is 1 (out of 1 available)
51243 1727204723.97493: exiting _queue_task() for managed-node3/debug
51243 1727204723.97506: done queuing things up, now waiting for results queue to drain
51243 1727204723.97508: waiting for pending results...
51243 1727204723.97896: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
51243 1727204723.97987: in run() - task 127b8e07-fff9-5c5d-847b-0000000000c2
51243 1727204723.98013: variable 'ansible_search_path' from source: unknown
51243 1727204723.98021: variable 'ansible_search_path' from source: unknown
51243 1727204723.98075: calling self._execute()
51243 1727204723.98180: variable 'ansible_host' from source: host vars for 'managed-node3'
51243 1727204723.98212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
51243 1727204723.98215: variable 'omit' from source: magic vars
51243 1727204723.98632: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.98648: Evaluated conditional (ansible_distribution_major_version != '6'): True
51243 1727204723.98734: variable 'ansible_distribution_major_version' from source: facts
51243 1727204723.98741: Evaluated conditional (ansible_distribution_major_version == '7'): False
51243 1727204723.98745: when evaluation is False, skipping this task
51243 1727204723.98754: _execute() done
51243 1727204723.98758: dumping result to json
51243 1727204723.98762: done dumping result, returning
51243 1727204723.98770: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-5c5d-847b-0000000000c2]
51243 1727204723.98776: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000c2
51243 1727204723.98878: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000c2
51243 1727204723.98882: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
51243 1727204723.98931: no more pending results, returning what we have
51243 1727204723.98935: results queue empty
51243 1727204723.98936: checking for any_errors_fatal
51243 1727204723.98944: done checking for any_errors_fatal
51243 1727204723.98945: checking for max_fail_percentage
51243 1727204723.98947: done checking for max_fail_percentage
51243 1727204723.98948: checking to see if all hosts have failed and the running result is not ok
51243 1727204723.98949: done checking to see if all hosts have failed
51243 1727204723.98949: getting the remaining hosts for this loop
51243 1727204723.98951: done getting the remaining hosts for this loop
51243 1727204723.98955: getting the next task for host managed-node3
51243 1727204723.98962: done getting next task for host managed-node3
51243 1727204723.98969: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
51243 1727204723.98972: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
51243 1727204723.98990: getting variables
51243 1727204723.98992: in VariableManager get_vars()
51243 1727204723.99037: Calling all_inventory to load vars for managed-node3
51243 1727204723.99039: Calling groups_inventory to load vars for managed-node3
51243 1727204723.99042: Calling all_plugins_inventory to load vars for managed-node3
51243 1727204723.99051: Calling all_plugins_play to load vars for managed-node3
51243 1727204723.99053: Calling groups_plugins_inventory to load vars for managed-node3
51243 1727204723.99056: Calling groups_plugins_play to load vars for managed-node3
51243 1727204723.99216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
51243 1727204723.99368: done with get_vars()
51243 1727204723.99378: done getting variables
51243 1727204723.99429: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Tuesday 24 September 2024  15:05:23 -0400 (0:00:00.022)       0:00:06.602 *****
51243 1727204723.99455: entering _queue_task() for managed-node3/debug
51243 1727204723.99707: worker is 1 (out of 1 available)
51243 1727204723.99722: exiting _queue_task() for managed-node3/debug
51243 1727204723.99735: done queuing things up, now waiting for results queue to drain
51243 1727204723.99737: waiting for pending results...
51243 1727204723.99914: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 51243 1727204724.00010: in run() - task 127b8e07-fff9-5c5d-847b-0000000000c3 51243 1727204724.00023: variable 'ansible_search_path' from source: unknown 51243 1727204724.00026: variable 'ansible_search_path' from source: unknown 51243 1727204724.00065: calling self._execute() 51243 1727204724.00144: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.00148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.00158: variable 'omit' from source: magic vars 51243 1727204724.00470: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.00482: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.00574: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.00579: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.00582: when evaluation is False, skipping this task 51243 1727204724.00585: _execute() done 51243 1727204724.00588: dumping result to json 51243 1727204724.00591: done dumping result, returning 51243 1727204724.00600: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-5c5d-847b-0000000000c3] 51243 1727204724.00606: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000c3 51243 1727204724.00701: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000c3 51243 1727204724.00704: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204724.00779: no more pending results, returning what we have 51243 1727204724.00782: results queue empty 51243 1727204724.00783: checking for any_errors_fatal 51243 1727204724.00789: done 
checking for any_errors_fatal 51243 1727204724.00790: checking for max_fail_percentage 51243 1727204724.00791: done checking for max_fail_percentage 51243 1727204724.00792: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.00793: done checking to see if all hosts have failed 51243 1727204724.00794: getting the remaining hosts for this loop 51243 1727204724.00795: done getting the remaining hosts for this loop 51243 1727204724.00799: getting the next task for host managed-node3 51243 1727204724.00805: done getting next task for host managed-node3 51243 1727204724.00810: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 51243 1727204724.00812: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204724.00833: getting variables 51243 1727204724.00834: in VariableManager get_vars() 51243 1727204724.00879: Calling all_inventory to load vars for managed-node3 51243 1727204724.00882: Calling groups_inventory to load vars for managed-node3 51243 1727204724.00884: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.00893: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.00896: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.00899: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.01084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.01229: done with get_vars() 51243 1727204724.01239: done getting variables 51243 1727204724.01297: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.018) 0:00:06.621 ***** 51243 1727204724.01329: entering _queue_task() for managed-node3/debug 51243 1727204724.01582: worker is 1 (out of 1 available) 51243 1727204724.01598: exiting _queue_task() for managed-node3/debug 51243 1727204724.01611: done queuing things up, now waiting for results queue to drain 51243 1727204724.01613: waiting for pending results... 
51243 1727204724.01797: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 51243 1727204724.01895: in run() - task 127b8e07-fff9-5c5d-847b-0000000000c4 51243 1727204724.01908: variable 'ansible_search_path' from source: unknown 51243 1727204724.01912: variable 'ansible_search_path' from source: unknown 51243 1727204724.01948: calling self._execute() 51243 1727204724.02022: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.02026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.02038: variable 'omit' from source: magic vars 51243 1727204724.02351: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.02361: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.02453: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.02457: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.02460: when evaluation is False, skipping this task 51243 1727204724.02463: _execute() done 51243 1727204724.02468: dumping result to json 51243 1727204724.02471: done dumping result, returning 51243 1727204724.02479: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-5c5d-847b-0000000000c4] 51243 1727204724.02485: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000c4 51243 1727204724.02581: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000c4 51243 1727204724.02584: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204724.02657: no more pending results, returning what we have 51243 1727204724.02661: results queue empty 51243 1727204724.02662: checking for any_errors_fatal 51243 1727204724.02671: done checking for 
any_errors_fatal 51243 1727204724.02672: checking for max_fail_percentage 51243 1727204724.02673: done checking for max_fail_percentage 51243 1727204724.02674: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.02675: done checking to see if all hosts have failed 51243 1727204724.02676: getting the remaining hosts for this loop 51243 1727204724.02677: done getting the remaining hosts for this loop 51243 1727204724.02682: getting the next task for host managed-node3 51243 1727204724.02688: done getting next task for host managed-node3 51243 1727204724.02692: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 51243 1727204724.02695: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204724.02715: getting variables 51243 1727204724.02716: in VariableManager get_vars() 51243 1727204724.02759: Calling all_inventory to load vars for managed-node3 51243 1727204724.02762: Calling groups_inventory to load vars for managed-node3 51243 1727204724.02764: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.02781: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.02784: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.02787: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.02928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.03088: done with get_vars() 51243 1727204724.03100: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.018) 0:00:06.639 ***** 51243 1727204724.03175: entering _queue_task() for managed-node3/ping 51243 1727204724.03425: worker is 1 (out of 1 available) 51243 1727204724.03441: exiting _queue_task() for managed-node3/ping 51243 1727204724.03452: done queuing things up, now waiting for results queue to drain 51243 1727204724.03454: waiting for pending results... 
51243 1727204724.03645: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 51243 1727204724.03752: in run() - task 127b8e07-fff9-5c5d-847b-0000000000c5 51243 1727204724.03764: variable 'ansible_search_path' from source: unknown 51243 1727204724.03771: variable 'ansible_search_path' from source: unknown 51243 1727204724.03805: calling self._execute() 51243 1727204724.03877: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.03883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.03892: variable 'omit' from source: magic vars 51243 1727204724.04224: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.04234: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.04317: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.04321: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.04323: when evaluation is False, skipping this task 51243 1727204724.04326: _execute() done 51243 1727204724.04329: dumping result to json 51243 1727204724.04332: done dumping result, returning 51243 1727204724.04345: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-5c5d-847b-0000000000c5] 51243 1727204724.04352: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000c5 51243 1727204724.04448: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000c5 51243 1727204724.04451: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.04505: no more pending results, returning what we have 51243 1727204724.04508: results queue empty 51243 1727204724.04509: checking for any_errors_fatal 51243 
1727204724.04517: done checking for any_errors_fatal 51243 1727204724.04517: checking for max_fail_percentage 51243 1727204724.04519: done checking for max_fail_percentage 51243 1727204724.04520: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.04521: done checking to see if all hosts have failed 51243 1727204724.04521: getting the remaining hosts for this loop 51243 1727204724.04524: done getting the remaining hosts for this loop 51243 1727204724.04528: getting the next task for host managed-node3 51243 1727204724.04535: done getting next task for host managed-node3 51243 1727204724.04537: ^ task is: TASK: meta (role_complete) 51243 1727204724.04540: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204724.04560: getting variables 51243 1727204724.04562: in VariableManager get_vars() 51243 1727204724.04613: Calling all_inventory to load vars for managed-node3 51243 1727204724.04616: Calling groups_inventory to load vars for managed-node3 51243 1727204724.04618: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.04628: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.04631: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.04634: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.04818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.04961: done with get_vars() 51243 1727204724.04971: done getting variables 51243 1727204724.05033: done queuing things up, now waiting for results queue to drain 51243 1727204724.05035: results queue empty 51243 1727204724.05036: checking for any_errors_fatal 51243 1727204724.05038: done checking for any_errors_fatal 51243 1727204724.05038: checking for max_fail_percentage 51243 1727204724.05039: done checking for max_fail_percentage 51243 1727204724.05039: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.05040: done checking to see if all hosts have failed 51243 1727204724.05040: getting the remaining hosts for this loop 51243 1727204724.05041: done getting the remaining hosts for this loop 51243 1727204724.05043: getting the next task for host managed-node3 51243 1727204724.05047: done getting next task for host managed-node3 51243 1727204724.05049: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 51243 1727204724.05051: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 51243 1727204724.05057: getting variables 51243 1727204724.05058: in VariableManager get_vars() 51243 1727204724.05074: Calling all_inventory to load vars for managed-node3 51243 1727204724.05076: Calling groups_inventory to load vars for managed-node3 51243 1727204724.05077: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.05081: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.05082: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.05084: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.05181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.05318: done with get_vars() 51243 1727204724.05324: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.022) 0:00:06.661 ***** 51243 1727204724.05383: entering _queue_task() for managed-node3/include_tasks 51243 1727204724.05624: worker is 1 (out of 1 available) 51243 1727204724.05637: exiting _queue_task() for managed-node3/include_tasks 51243 1727204724.05649: done queuing things up, now waiting for 
results queue to drain 51243 1727204724.05651: waiting for pending results... 51243 1727204724.05832: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 51243 1727204724.05931: in run() - task 127b8e07-fff9-5c5d-847b-0000000000fd 51243 1727204724.05945: variable 'ansible_search_path' from source: unknown 51243 1727204724.05949: variable 'ansible_search_path' from source: unknown 51243 1727204724.05982: calling self._execute() 51243 1727204724.06056: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.06061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.06072: variable 'omit' from source: magic vars 51243 1727204724.06375: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.06385: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.06525: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.06531: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.06536: when evaluation is False, skipping this task 51243 1727204724.06541: _execute() done 51243 1727204724.06544: dumping result to json 51243 1727204724.06547: done dumping result, returning 51243 1727204724.06550: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-5c5d-847b-0000000000fd] 51243 1727204724.06553: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000fd 51243 1727204724.06651: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000fd 51243 1727204724.06655: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.06712: no more pending results, returning what we have 51243 
1727204724.06715: results queue empty 51243 1727204724.06716: checking for any_errors_fatal 51243 1727204724.06718: done checking for any_errors_fatal 51243 1727204724.06718: checking for max_fail_percentage 51243 1727204724.06720: done checking for max_fail_percentage 51243 1727204724.06720: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.06721: done checking to see if all hosts have failed 51243 1727204724.06722: getting the remaining hosts for this loop 51243 1727204724.06724: done getting the remaining hosts for this loop 51243 1727204724.06728: getting the next task for host managed-node3 51243 1727204724.06738: done getting next task for host managed-node3 51243 1727204724.06741: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 51243 1727204724.06745: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.06763: getting variables 51243 1727204724.06765: in VariableManager get_vars() 51243 1727204724.06807: Calling all_inventory to load vars for managed-node3 51243 1727204724.06810: Calling groups_inventory to load vars for managed-node3 51243 1727204724.06812: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.06822: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.06824: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.06827: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.07000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.07149: done with get_vars() 51243 1727204724.07157: done getting variables 51243 1727204724.07205: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.018) 0:00:06.680 ***** 51243 1727204724.07229: entering _queue_task() for managed-node3/debug 51243 1727204724.07470: worker is 1 (out of 1 available) 51243 1727204724.07486: exiting _queue_task() for managed-node3/debug 51243 1727204724.07498: done queuing things up, now waiting for results queue to drain 51243 1727204724.07500: waiting for pending results... 
51243 1727204724.07679: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 51243 1727204724.07779: in run() - task 127b8e07-fff9-5c5d-847b-0000000000fe 51243 1727204724.07791: variable 'ansible_search_path' from source: unknown 51243 1727204724.07796: variable 'ansible_search_path' from source: unknown 51243 1727204724.07830: calling self._execute() 51243 1727204724.07901: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.07905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.07914: variable 'omit' from source: magic vars 51243 1727204724.08226: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.08238: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.08325: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.08329: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.08332: when evaluation is False, skipping this task 51243 1727204724.08338: _execute() done 51243 1727204724.08349: dumping result to json 51243 1727204724.08352: done dumping result, returning 51243 1727204724.08355: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-5c5d-847b-0000000000fe] 51243 1727204724.08373: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000fe 51243 1727204724.08472: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000fe 51243 1727204724.08475: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204724.08548: no more pending results, returning what we have 51243 1727204724.08551: results queue empty 51243 1727204724.08552: checking for any_errors_fatal 51243 1727204724.08558: done checking for any_errors_fatal 51243 1727204724.08559: 
checking for max_fail_percentage 51243 1727204724.08560: done checking for max_fail_percentage 51243 1727204724.08561: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.08562: done checking to see if all hosts have failed 51243 1727204724.08563: getting the remaining hosts for this loop 51243 1727204724.08564: done getting the remaining hosts for this loop 51243 1727204724.08570: getting the next task for host managed-node3 51243 1727204724.08577: done getting next task for host managed-node3 51243 1727204724.08582: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 51243 1727204724.08586: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.08605: getting variables 51243 1727204724.08607: in VariableManager get_vars() 51243 1727204724.08652: Calling all_inventory to load vars for managed-node3 51243 1727204724.08655: Calling groups_inventory to load vars for managed-node3 51243 1727204724.08657: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.08674: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.08676: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.08679: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.08827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.09010: done with get_vars() 51243 1727204724.09018: done getting variables 51243 1727204724.09067: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.018) 0:00:06.698 ***** 51243 1727204724.09092: entering _queue_task() for managed-node3/fail 51243 1727204724.09338: worker is 1 (out of 1 available) 51243 1727204724.09352: exiting _queue_task() for managed-node3/fail 51243 1727204724.09364: done queuing things up, now waiting for results queue to drain 51243 1727204724.09367: waiting for pending results... 
51243 1727204724.09620: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 51243 1727204724.09762: in run() - task 127b8e07-fff9-5c5d-847b-0000000000ff 51243 1727204724.09767: variable 'ansible_search_path' from source: unknown 51243 1727204724.09771: variable 'ansible_search_path' from source: unknown 51243 1727204724.09786: calling self._execute() 51243 1727204724.09930: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.09937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.09940: variable 'omit' from source: magic vars 51243 1727204724.10475: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.10480: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.10483: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.10485: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.10496: when evaluation is False, skipping this task 51243 1727204724.10505: _execute() done 51243 1727204724.10513: dumping result to json 51243 1727204724.10520: done dumping result, returning 51243 1727204724.10531: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-5c5d-847b-0000000000ff] 51243 1727204724.10547: sending task result for task 127b8e07-fff9-5c5d-847b-0000000000ff skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.10746: no more pending results, returning what we have 51243 1727204724.10751: results queue empty 51243 1727204724.10752: 
checking for any_errors_fatal 51243 1727204724.10759: done checking for any_errors_fatal 51243 1727204724.10760: checking for max_fail_percentage 51243 1727204724.10762: done checking for max_fail_percentage 51243 1727204724.10763: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.10764: done checking to see if all hosts have failed 51243 1727204724.10764: getting the remaining hosts for this loop 51243 1727204724.10769: done getting the remaining hosts for this loop 51243 1727204724.10774: getting the next task for host managed-node3 51243 1727204724.10784: done getting next task for host managed-node3 51243 1727204724.10789: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 51243 1727204724.10794: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.10820: getting variables 51243 1727204724.10822: in VariableManager get_vars() 51243 1727204724.11098: Calling all_inventory to load vars for managed-node3 51243 1727204724.11102: Calling groups_inventory to load vars for managed-node3 51243 1727204724.11104: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.11116: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.11119: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.11123: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.11461: done sending task result for task 127b8e07-fff9-5c5d-847b-0000000000ff 51243 1727204724.11467: WORKER PROCESS EXITING 51243 1727204724.11494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.11788: done with get_vars() 51243 1727204724.11803: done getting variables 51243 1727204724.11885: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.028) 0:00:06.726 ***** 51243 1727204724.11925: entering _queue_task() for managed-node3/fail 51243 1727204724.12409: worker is 1 (out of 1 available) 51243 1727204724.12425: exiting _queue_task() for managed-node3/fail 51243 1727204724.12440: done queuing things up, now waiting for results queue to drain 51243 1727204724.12442: waiting for pending results... 
51243 1727204724.12677: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 51243 1727204724.12868: in run() - task 127b8e07-fff9-5c5d-847b-000000000100 51243 1727204724.12893: variable 'ansible_search_path' from source: unknown 51243 1727204724.12907: variable 'ansible_search_path' from source: unknown 51243 1727204724.12969: calling self._execute() 51243 1727204724.13121: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.13125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.13128: variable 'omit' from source: magic vars 51243 1727204724.13613: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.13701: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.13782: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.13793: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.13800: when evaluation is False, skipping this task 51243 1727204724.13818: _execute() done 51243 1727204724.13831: dumping result to json 51243 1727204724.13844: done dumping result, returning 51243 1727204724.13858: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-5c5d-847b-000000000100] 51243 1727204724.13918: sending task result for task 127b8e07-fff9-5c5d-847b-000000000100 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.14073: no more pending results, returning what we have 51243 1727204724.14077: results queue empty 51243 1727204724.14078: checking for any_errors_fatal 51243 
1727204724.14086: done checking for any_errors_fatal 51243 1727204724.14087: checking for max_fail_percentage 51243 1727204724.14088: done checking for max_fail_percentage 51243 1727204724.14089: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.14090: done checking to see if all hosts have failed 51243 1727204724.14091: getting the remaining hosts for this loop 51243 1727204724.14093: done getting the remaining hosts for this loop 51243 1727204724.14097: getting the next task for host managed-node3 51243 1727204724.14104: done getting next task for host managed-node3 51243 1727204724.14108: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 51243 1727204724.14113: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.14136: getting variables 51243 1727204724.14138: in VariableManager get_vars() 51243 1727204724.14194: Calling all_inventory to load vars for managed-node3 51243 1727204724.14197: Calling groups_inventory to load vars for managed-node3 51243 1727204724.14199: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.14205: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000100 51243 1727204724.14208: WORKER PROCESS EXITING 51243 1727204724.14218: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.14221: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.14224: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.14436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.14591: done with get_vars() 51243 1727204724.14600: done getting variables 51243 1727204724.14646: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.027) 0:00:06.754 ***** 51243 1727204724.14673: entering _queue_task() for managed-node3/fail 51243 1727204724.14906: worker is 1 (out of 1 available) 51243 1727204724.14920: exiting _queue_task() for managed-node3/fail 51243 1727204724.14935: done queuing things up, now waiting for results queue to drain 51243 1727204724.14937: waiting for pending results... 
51243 1727204724.15114: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 51243 1727204724.15214: in run() - task 127b8e07-fff9-5c5d-847b-000000000101 51243 1727204724.15226: variable 'ansible_search_path' from source: unknown 51243 1727204724.15230: variable 'ansible_search_path' from source: unknown 51243 1727204724.15261: calling self._execute() 51243 1727204724.15349: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.15354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.15362: variable 'omit' from source: magic vars 51243 1727204724.15783: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.15790: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.15828: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.15832: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.15835: when evaluation is False, skipping this task 51243 1727204724.15841: _execute() done 51243 1727204724.15844: dumping result to json 51243 1727204724.15847: done dumping result, returning 51243 1727204724.15854: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-5c5d-847b-000000000101] 51243 1727204724.15859: sending task result for task 127b8e07-fff9-5c5d-847b-000000000101 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.16013: no more pending results, returning what we have 51243 1727204724.16017: results queue empty 51243 1727204724.16018: checking for any_errors_fatal 51243 
1727204724.16025: done checking for any_errors_fatal 51243 1727204724.16025: checking for max_fail_percentage 51243 1727204724.16027: done checking for max_fail_percentage 51243 1727204724.16028: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.16028: done checking to see if all hosts have failed 51243 1727204724.16029: getting the remaining hosts for this loop 51243 1727204724.16031: done getting the remaining hosts for this loop 51243 1727204724.16035: getting the next task for host managed-node3 51243 1727204724.16042: done getting next task for host managed-node3 51243 1727204724.16046: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 51243 1727204724.16049: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.16060: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000101 51243 1727204724.16062: WORKER PROCESS EXITING 51243 1727204724.16076: getting variables 51243 1727204724.16078: in VariableManager get_vars() 51243 1727204724.16127: Calling all_inventory to load vars for managed-node3 51243 1727204724.16129: Calling groups_inventory to load vars for managed-node3 51243 1727204724.16132: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.16141: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.16142: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.16145: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.16279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.16424: done with get_vars() 51243 1727204724.16435: done getting variables 51243 1727204724.16481: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.018) 0:00:06.772 ***** 51243 1727204724.16519: entering _queue_task() for managed-node3/dnf 51243 1727204724.16811: worker is 1 (out of 1 available) 51243 1727204724.16825: exiting _queue_task() for managed-node3/dnf 51243 1727204724.16838: done queuing things up, now waiting for results queue to drain 51243 1727204724.16840: waiting for pending results... 
51243 1727204724.17151: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 51243 1727204724.17231: in run() - task 127b8e07-fff9-5c5d-847b-000000000102 51243 1727204724.17238: variable 'ansible_search_path' from source: unknown 51243 1727204724.17241: variable 'ansible_search_path' from source: unknown 51243 1727204724.17281: calling self._execute() 51243 1727204724.17352: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.17356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.17367: variable 'omit' from source: magic vars 51243 1727204724.17669: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.17682: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.17769: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.17773: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.17776: when evaluation is False, skipping this task 51243 1727204724.17780: _execute() done 51243 1727204724.17782: dumping result to json 51243 1727204724.17785: done dumping result, returning 51243 1727204724.17794: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-000000000102] 51243 1727204724.17800: sending task result for task 127b8e07-fff9-5c5d-847b-000000000102 51243 1727204724.17903: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000102 51243 1727204724.17906: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 51243 1727204724.17962: no more pending results, returning what we have 51243 1727204724.17968: results queue empty 51243 1727204724.17969: checking for any_errors_fatal 51243 1727204724.17974: done checking for any_errors_fatal 51243 1727204724.17975: checking for max_fail_percentage 51243 1727204724.17977: done checking for max_fail_percentage 51243 1727204724.17977: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.17978: done checking to see if all hosts have failed 51243 1727204724.17979: getting the remaining hosts for this loop 51243 1727204724.17981: done getting the remaining hosts for this loop 51243 1727204724.17985: getting the next task for host managed-node3 51243 1727204724.17993: done getting next task for host managed-node3 51243 1727204724.17997: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 51243 1727204724.18000: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.18020: getting variables 51243 1727204724.18022: in VariableManager get_vars() 51243 1727204724.18078: Calling all_inventory to load vars for managed-node3 51243 1727204724.18081: Calling groups_inventory to load vars for managed-node3 51243 1727204724.18083: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.18093: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.18095: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.18098: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.18280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.18434: done with get_vars() 51243 1727204724.18444: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 51243 1727204724.18505: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.020) 0:00:06.793 ***** 51243 1727204724.18531: entering _queue_task() for managed-node3/yum 51243 1727204724.18795: worker is 1 (out of 1 available) 51243 1727204724.18811: exiting _queue_task() for managed-node3/yum 51243 1727204724.18824: done queuing things up, now waiting for results queue to drain 51243 1727204724.18826: waiting for pending results... 
51243 1727204724.19012: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 51243 1727204724.19116: in run() - task 127b8e07-fff9-5c5d-847b-000000000103 51243 1727204724.19128: variable 'ansible_search_path' from source: unknown 51243 1727204724.19132: variable 'ansible_search_path' from source: unknown 51243 1727204724.19169: calling self._execute() 51243 1727204724.19240: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.19244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.19254: variable 'omit' from source: magic vars 51243 1727204724.19558: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.19571: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.19659: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.19663: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.19668: when evaluation is False, skipping this task 51243 1727204724.19672: _execute() done 51243 1727204724.19674: dumping result to json 51243 1727204724.19679: done dumping result, returning 51243 1727204724.19686: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-000000000103] 51243 1727204724.19692: sending task result for task 127b8e07-fff9-5c5d-847b-000000000103 51243 1727204724.19799: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000103 51243 1727204724.19803: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 51243 1727204724.19870: no more pending results, returning what we have 51243 1727204724.19874: results queue empty 51243 1727204724.19876: checking for any_errors_fatal 51243 1727204724.19881: done checking for any_errors_fatal 51243 1727204724.19882: checking for max_fail_percentage 51243 1727204724.19883: done checking for max_fail_percentage 51243 1727204724.19884: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.19885: done checking to see if all hosts have failed 51243 1727204724.19886: getting the remaining hosts for this loop 51243 1727204724.19887: done getting the remaining hosts for this loop 51243 1727204724.19891: getting the next task for host managed-node3 51243 1727204724.19898: done getting next task for host managed-node3 51243 1727204724.19902: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 51243 1727204724.19905: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.19924: getting variables 51243 1727204724.19925: in VariableManager get_vars() 51243 1727204724.19979: Calling all_inventory to load vars for managed-node3 51243 1727204724.19982: Calling groups_inventory to load vars for managed-node3 51243 1727204724.19984: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.19994: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.19997: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.19999: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.20140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.20319: done with get_vars() 51243 1727204724.20328: done getting variables 51243 1727204724.20378: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.018) 0:00:06.811 ***** 51243 1727204724.20403: entering _queue_task() for managed-node3/fail 51243 1727204724.20650: worker is 1 (out of 1 available) 51243 1727204724.20668: exiting _queue_task() for managed-node3/fail 51243 1727204724.20680: done queuing things up, now waiting for results queue to drain 51243 1727204724.20682: waiting for pending results... 
51243 1727204724.20857: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 51243 1727204724.20963: in run() - task 127b8e07-fff9-5c5d-847b-000000000104 51243 1727204724.20977: variable 'ansible_search_path' from source: unknown 51243 1727204724.20980: variable 'ansible_search_path' from source: unknown 51243 1727204724.21014: calling self._execute() 51243 1727204724.21087: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.21091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.21101: variable 'omit' from source: magic vars 51243 1727204724.21634: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.21640: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.21715: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.21728: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.21740: when evaluation is False, skipping this task 51243 1727204724.21748: _execute() done 51243 1727204724.21772: dumping result to json 51243 1727204724.21868: done dumping result, returning 51243 1727204724.21874: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-000000000104] 51243 1727204724.21878: sending task result for task 127b8e07-fff9-5c5d-847b-000000000104 51243 1727204724.21961: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000104 51243 1727204724.21967: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.22030: no more pending results, returning what we have 
51243 1727204724.22037: results queue empty 51243 1727204724.22039: checking for any_errors_fatal 51243 1727204724.22046: done checking for any_errors_fatal 51243 1727204724.22047: checking for max_fail_percentage 51243 1727204724.22049: done checking for max_fail_percentage 51243 1727204724.22050: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.22051: done checking to see if all hosts have failed 51243 1727204724.22052: getting the remaining hosts for this loop 51243 1727204724.22054: done getting the remaining hosts for this loop 51243 1727204724.22058: getting the next task for host managed-node3 51243 1727204724.22170: done getting next task for host managed-node3 51243 1727204724.22176: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 51243 1727204724.22182: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.22212: getting variables 51243 1727204724.22214: in VariableManager get_vars() 51243 1727204724.22333: Calling all_inventory to load vars for managed-node3 51243 1727204724.22337: Calling groups_inventory to load vars for managed-node3 51243 1727204724.22339: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.22353: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.22355: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.22358: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.22551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.22732: done with get_vars() 51243 1727204724.22747: done getting variables 51243 1727204724.22796: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.024) 0:00:06.836 ***** 51243 1727204724.22827: entering _queue_task() for managed-node3/package 51243 1727204724.23098: worker is 1 (out of 1 available) 51243 1727204724.23113: exiting _queue_task() for managed-node3/package 51243 1727204724.23127: done queuing things up, now waiting for results queue to drain 51243 1727204724.23128: waiting for pending results... 
51243 1727204724.23493: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 51243 1727204724.23500: in run() - task 127b8e07-fff9-5c5d-847b-000000000105 51243 1727204724.23503: variable 'ansible_search_path' from source: unknown 51243 1727204724.23505: variable 'ansible_search_path' from source: unknown 51243 1727204724.23521: calling self._execute() 51243 1727204724.23608: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.23615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.23624: variable 'omit' from source: magic vars 51243 1727204724.23994: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.24006: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.24120: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.24137: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.24140: when evaluation is False, skipping this task 51243 1727204724.24144: _execute() done 51243 1727204724.24146: dumping result to json 51243 1727204724.24149: done dumping result, returning 51243 1727204724.24152: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-5c5d-847b-000000000105] 51243 1727204724.24154: sending task result for task 127b8e07-fff9-5c5d-847b-000000000105 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.24422: no more pending results, returning what we have 51243 1727204724.24426: results queue empty 51243 1727204724.24427: checking for any_errors_fatal 51243 1727204724.24434: done checking for any_errors_fatal 51243 1727204724.24435: checking for max_fail_percentage 51243 1727204724.24437: done checking for 
max_fail_percentage 51243 1727204724.24437: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.24438: done checking to see if all hosts have failed 51243 1727204724.24439: getting the remaining hosts for this loop 51243 1727204724.24440: done getting the remaining hosts for this loop 51243 1727204724.24443: getting the next task for host managed-node3 51243 1727204724.24452: done getting next task for host managed-node3 51243 1727204724.24456: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 51243 1727204724.24460: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.24485: getting variables 51243 1727204724.24487: in VariableManager get_vars() 51243 1727204724.24531: Calling all_inventory to load vars for managed-node3 51243 1727204724.24536: Calling groups_inventory to load vars for managed-node3 51243 1727204724.24538: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.24547: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.24550: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.24553: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.24782: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000105 51243 1727204724.24785: WORKER PROCESS EXITING 51243 1727204724.24815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.25058: done with get_vars() 51243 1727204724.25070: done getting variables 51243 1727204724.25135: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.023) 0:00:06.859 ***** 51243 1727204724.25169: entering _queue_task() for managed-node3/package 51243 1727204724.25599: worker is 1 (out of 1 available) 51243 1727204724.25613: exiting _queue_task() for managed-node3/package 51243 1727204724.25626: done queuing things up, now waiting for results queue to drain 51243 1727204724.25628: waiting for pending results... 
51243 1727204724.25876: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 51243 1727204724.26277: in run() - task 127b8e07-fff9-5c5d-847b-000000000106 51243 1727204724.26281: variable 'ansible_search_path' from source: unknown 51243 1727204724.26283: variable 'ansible_search_path' from source: unknown 51243 1727204724.26285: calling self._execute() 51243 1727204724.26418: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.26425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.26438: variable 'omit' from source: magic vars 51243 1727204724.26826: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.26839: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.26958: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.26964: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.26969: when evaluation is False, skipping this task 51243 1727204724.26972: _execute() done 51243 1727204724.26975: dumping result to json 51243 1727204724.26978: done dumping result, returning 51243 1727204724.26988: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-5c5d-847b-000000000106] 51243 1727204724.26994: sending task result for task 127b8e07-fff9-5c5d-847b-000000000106 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.27198: no more pending results, returning what we have 51243 1727204724.27202: results queue empty 51243 1727204724.27203: checking for any_errors_fatal 51243 1727204724.27209: done checking for any_errors_fatal 51243 
1727204724.27210: checking for max_fail_percentage 51243 1727204724.27211: done checking for max_fail_percentage 51243 1727204724.27212: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.27213: done checking to see if all hosts have failed 51243 1727204724.27214: getting the remaining hosts for this loop 51243 1727204724.27216: done getting the remaining hosts for this loop 51243 1727204724.27220: getting the next task for host managed-node3 51243 1727204724.27226: done getting next task for host managed-node3 51243 1727204724.27229: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 51243 1727204724.27235: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.27259: getting variables 51243 1727204724.27260: in VariableManager get_vars() 51243 1727204724.27307: Calling all_inventory to load vars for managed-node3 51243 1727204724.27310: Calling groups_inventory to load vars for managed-node3 51243 1727204724.27312: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.27319: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000106 51243 1727204724.27322: WORKER PROCESS EXITING 51243 1727204724.27331: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.27337: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.27340: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.27561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.27837: done with get_vars() 51243 1727204724.27852: done getting variables 51243 1727204724.27925: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.027) 0:00:06.887 ***** 51243 1727204724.27968: entering _queue_task() for managed-node3/package 51243 1727204724.28329: worker is 1 (out of 1 available) 51243 1727204724.28352: exiting _queue_task() for managed-node3/package 51243 1727204724.28369: done queuing things up, now waiting for results queue to drain 51243 1727204724.28371: waiting for pending results... 
51243 1727204724.28671: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 51243 1727204724.28858: in run() - task 127b8e07-fff9-5c5d-847b-000000000107 51243 1727204724.28863: variable 'ansible_search_path' from source: unknown 51243 1727204724.28870: variable 'ansible_search_path' from source: unknown 51243 1727204724.28874: calling self._execute() 51243 1727204724.28951: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.28956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.28969: variable 'omit' from source: magic vars 51243 1727204724.29354: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.29369: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.29546: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.29550: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.29553: when evaluation is False, skipping this task 51243 1727204724.29556: _execute() done 51243 1727204724.29559: dumping result to json 51243 1727204724.29561: done dumping result, returning 51243 1727204724.29564: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-5c5d-847b-000000000107] 51243 1727204724.29568: sending task result for task 127b8e07-fff9-5c5d-847b-000000000107 51243 1727204724.29760: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000107 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.29803: no more pending results, returning what we have 51243 1727204724.29806: results queue empty 51243 1727204724.29807: checking for 
any_errors_fatal 51243 1727204724.29812: done checking for any_errors_fatal 51243 1727204724.29813: checking for max_fail_percentage 51243 1727204724.29815: done checking for max_fail_percentage 51243 1727204724.29815: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.29816: done checking to see if all hosts have failed 51243 1727204724.29817: getting the remaining hosts for this loop 51243 1727204724.29818: done getting the remaining hosts for this loop 51243 1727204724.29822: getting the next task for host managed-node3 51243 1727204724.29829: done getting next task for host managed-node3 51243 1727204724.29836: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 51243 1727204724.29839: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.29858: getting variables 51243 1727204724.29860: in VariableManager get_vars() 51243 1727204724.29917: Calling all_inventory to load vars for managed-node3 51243 1727204724.29920: Calling groups_inventory to load vars for managed-node3 51243 1727204724.29922: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.29929: WORKER PROCESS EXITING 51243 1727204724.29941: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.29944: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.29947: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.30218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.30809: done with get_vars() 51243 1727204724.30824: done getting variables 51243 1727204724.30928: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.030) 0:00:06.917 ***** 51243 1727204724.30989: entering _queue_task() for managed-node3/service 51243 1727204724.31374: worker is 1 (out of 1 available) 51243 1727204724.31388: exiting _queue_task() for managed-node3/service 51243 1727204724.31401: done queuing things up, now waiting for results queue to drain 51243 1727204724.31402: waiting for pending results... 
51243 1727204724.31963: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 51243 1727204724.32147: in run() - task 127b8e07-fff9-5c5d-847b-000000000108 51243 1727204724.32217: variable 'ansible_search_path' from source: unknown 51243 1727204724.32221: variable 'ansible_search_path' from source: unknown 51243 1727204724.32274: calling self._execute() 51243 1727204724.32438: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.32442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.32444: variable 'omit' from source: magic vars 51243 1727204724.32859: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.32890: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.33084: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.33090: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.33096: when evaluation is False, skipping this task 51243 1727204724.33099: _execute() done 51243 1727204724.33102: dumping result to json 51243 1727204724.33104: done dumping result, returning 51243 1727204724.33107: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-5c5d-847b-000000000108] 51243 1727204724.33109: sending task result for task 127b8e07-fff9-5c5d-847b-000000000108 51243 1727204724.33432: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000108 51243 1727204724.33438: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.33495: no more pending results, returning what we have 51243 1727204724.33499: results queue empty 
51243 1727204724.33500: checking for any_errors_fatal 51243 1727204724.33507: done checking for any_errors_fatal 51243 1727204724.33508: checking for max_fail_percentage 51243 1727204724.33510: done checking for max_fail_percentage 51243 1727204724.33511: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.33512: done checking to see if all hosts have failed 51243 1727204724.33513: getting the remaining hosts for this loop 51243 1727204724.33515: done getting the remaining hosts for this loop 51243 1727204724.33519: getting the next task for host managed-node3 51243 1727204724.33528: done getting next task for host managed-node3 51243 1727204724.33532: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 51243 1727204724.33539: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.33570: getting variables 51243 1727204724.33573: in VariableManager get_vars() 51243 1727204724.33635: Calling all_inventory to load vars for managed-node3 51243 1727204724.33638: Calling groups_inventory to load vars for managed-node3 51243 1727204724.33641: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.33771: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.33777: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.33782: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.34044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.34306: done with get_vars() 51243 1727204724.34321: done getting variables 51243 1727204724.34405: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.034) 0:00:06.952 ***** 51243 1727204724.34452: entering _queue_task() for managed-node3/service 51243 1727204724.34896: worker is 1 (out of 1 available) 51243 1727204724.34908: exiting _queue_task() for managed-node3/service 51243 1727204724.34920: done queuing things up, now waiting for results queue to drain 51243 1727204724.34921: waiting for pending results... 
51243 1727204724.35180: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 51243 1727204724.35324: in run() - task 127b8e07-fff9-5c5d-847b-000000000109 51243 1727204724.35418: variable 'ansible_search_path' from source: unknown 51243 1727204724.35422: variable 'ansible_search_path' from source: unknown 51243 1727204724.35424: calling self._execute() 51243 1727204724.35507: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.35524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.35545: variable 'omit' from source: magic vars 51243 1727204724.36082: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.36102: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.36244: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.36255: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.36293: when evaluation is False, skipping this task 51243 1727204724.36297: _execute() done 51243 1727204724.36299: dumping result to json 51243 1727204724.36301: done dumping result, returning 51243 1727204724.36304: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-5c5d-847b-000000000109] 51243 1727204724.36308: sending task result for task 127b8e07-fff9-5c5d-847b-000000000109 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 51243 1727204724.36618: no more pending results, returning what we have 51243 1727204724.36622: results queue empty 51243 1727204724.36624: checking for any_errors_fatal 51243 1727204724.36631: done checking for any_errors_fatal 51243 1727204724.36635: checking for max_fail_percentage 51243 1727204724.36637: done 
checking for max_fail_percentage 51243 1727204724.36638: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.36639: done checking to see if all hosts have failed 51243 1727204724.36639: getting the remaining hosts for this loop 51243 1727204724.36641: done getting the remaining hosts for this loop 51243 1727204724.36647: getting the next task for host managed-node3 51243 1727204724.36656: done getting next task for host managed-node3 51243 1727204724.36661: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 51243 1727204724.36667: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.36690: getting variables 51243 1727204724.36692: in VariableManager get_vars() 51243 1727204724.36749: Calling all_inventory to load vars for managed-node3 51243 1727204724.36752: Calling groups_inventory to load vars for managed-node3 51243 1727204724.36755: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.36884: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.36889: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.36895: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000109 51243 1727204724.36898: WORKER PROCESS EXITING 51243 1727204724.36902: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.37259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.37536: done with get_vars() 51243 1727204724.37550: done getting variables 51243 1727204724.37616: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.032) 0:00:06.984 ***** 51243 1727204724.37661: entering _queue_task() for managed-node3/service 51243 1727204724.38024: worker is 1 (out of 1 available) 51243 1727204724.38040: exiting _queue_task() for managed-node3/service 51243 1727204724.38053: done queuing things up, now waiting for results queue to drain 51243 1727204724.38055: waiting for pending results... 
51243 1727204724.38369: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 51243 1727204724.38555: in run() - task 127b8e07-fff9-5c5d-847b-00000000010a 51243 1727204724.38585: variable 'ansible_search_path' from source: unknown 51243 1727204724.38598: variable 'ansible_search_path' from source: unknown 51243 1727204724.38671: calling self._execute() 51243 1727204724.38758: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.38836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.38839: variable 'omit' from source: magic vars 51243 1727204724.40072: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.40102: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.40724: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.40746: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.40755: when evaluation is False, skipping this task 51243 1727204724.40763: _execute() done 51243 1727204724.40774: dumping result to json 51243 1727204724.40782: done dumping result, returning 51243 1727204724.40796: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-5c5d-847b-00000000010a] 51243 1727204724.40807: sending task result for task 127b8e07-fff9-5c5d-847b-00000000010a 51243 1727204724.41154: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000010a 51243 1727204724.41157: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.41217: no more pending results, returning what we have 51243 1727204724.41221: results queue empty 51243 1727204724.41223: checking for any_errors_fatal 
51243 1727204724.41229: done checking for any_errors_fatal 51243 1727204724.41230: checking for max_fail_percentage 51243 1727204724.41232: done checking for max_fail_percentage 51243 1727204724.41233: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.41234: done checking to see if all hosts have failed 51243 1727204724.41235: getting the remaining hosts for this loop 51243 1727204724.41237: done getting the remaining hosts for this loop 51243 1727204724.41241: getting the next task for host managed-node3 51243 1727204724.41250: done getting next task for host managed-node3 51243 1727204724.41260: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 51243 1727204724.41267: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.41291: getting variables 51243 1727204724.41293: in VariableManager get_vars() 51243 1727204724.41350: Calling all_inventory to load vars for managed-node3 51243 1727204724.41354: Calling groups_inventory to load vars for managed-node3 51243 1727204724.41357: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.41876: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.41881: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.41886: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.42416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.43587: done with get_vars() 51243 1727204724.43602: done getting variables 51243 1727204724.43670: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.060) 0:00:07.044 ***** 51243 1727204724.43706: entering _queue_task() for managed-node3/service 51243 1727204724.44562: worker is 1 (out of 1 available) 51243 1727204724.44623: exiting _queue_task() for managed-node3/service 51243 1727204724.44635: done queuing things up, now waiting for results queue to drain 51243 1727204724.44637: waiting for pending results... 
51243 1727204724.45473: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 51243 1727204724.45877: in run() - task 127b8e07-fff9-5c5d-847b-00000000010b 51243 1727204724.45892: variable 'ansible_search_path' from source: unknown 51243 1727204724.45896: variable 'ansible_search_path' from source: unknown 51243 1727204724.45936: calling self._execute() 51243 1727204724.46236: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.46331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.46335: variable 'omit' from source: magic vars 51243 1727204724.47233: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.47251: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.47705: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.47709: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.47712: when evaluation is False, skipping this task 51243 1727204724.47715: _execute() done 51243 1727204724.47718: dumping result to json 51243 1727204724.47720: done dumping result, returning 51243 1727204724.47836: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-5c5d-847b-00000000010b] 51243 1727204724.47839: sending task result for task 127b8e07-fff9-5c5d-847b-00000000010b 51243 1727204724.48494: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000010b 51243 1727204724.48498: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 51243 1727204724.48549: no more pending results, returning what we have 51243 1727204724.48553: results queue empty 51243 1727204724.48554: checking for any_errors_fatal 51243 
1727204724.48570: done checking for any_errors_fatal 51243 1727204724.48571: checking for max_fail_percentage 51243 1727204724.48573: done checking for max_fail_percentage 51243 1727204724.48574: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.48575: done checking to see if all hosts have failed 51243 1727204724.48576: getting the remaining hosts for this loop 51243 1727204724.48578: done getting the remaining hosts for this loop 51243 1727204724.48582: getting the next task for host managed-node3 51243 1727204724.48591: done getting next task for host managed-node3 51243 1727204724.48596: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 51243 1727204724.48601: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.48626: getting variables 51243 1727204724.48628: in VariableManager get_vars() 51243 1727204724.48985: Calling all_inventory to load vars for managed-node3 51243 1727204724.48989: Calling groups_inventory to load vars for managed-node3 51243 1727204724.48991: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.49002: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.49005: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.49008: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.49330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.50009: done with get_vars() 51243 1727204724.50025: done getting variables 51243 1727204724.50194: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.065) 0:00:07.110 ***** 51243 1727204724.50234: entering _queue_task() for managed-node3/copy 51243 1727204724.50902: worker is 1 (out of 1 available) 51243 1727204724.50914: exiting _queue_task() for managed-node3/copy 51243 1727204724.50928: done queuing things up, now waiting for results queue to drain 51243 1727204724.50930: waiting for pending results... 
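[editor's note] Every debug line in this log has the same `PID TIMESTAMP: message` shape (for example `51243 1727204724.48626: getting variables`), which makes the stream easy to post-process. A small parsing helper (my own sketch, not part of Ansible):

```python
import re

# Each -vvvv debug line is "<worker pid> <unix timestamp>: <message>".
DEBUG_LINE = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def parse_debug_line(line: str):
    """Split one verbose-log line into (pid, timestamp, message), or None."""
    m = DEBUG_LINE.match(line.strip())
    if m is None:
        return None
    return int(m["pid"]), float(m["ts"]), m["msg"]
```

Usage: `parse_debug_line("51243 1727204724.48626: getting variables")` yields `(51243, 1727204724.48626, "getting variables")`; non-matching lines return `None`.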
51243 1727204724.51263: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 51243 1727204724.51775: in run() - task 127b8e07-fff9-5c5d-847b-00000000010c 51243 1727204724.51779: variable 'ansible_search_path' from source: unknown 51243 1727204724.51782: variable 'ansible_search_path' from source: unknown 51243 1727204724.51784: calling self._execute() 51243 1727204724.51872: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.51940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.51956: variable 'omit' from source: magic vars 51243 1727204724.52704: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.52764: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.52951: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.52967: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.53028: when evaluation is False, skipping this task 51243 1727204724.53194: _execute() done 51243 1727204724.53197: dumping result to json 51243 1727204724.53200: done dumping result, returning 51243 1727204724.53203: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-5c5d-847b-00000000010c] 51243 1727204724.53205: sending task result for task 127b8e07-fff9-5c5d-847b-00000000010c 51243 1727204724.53494: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000010c 51243 1727204724.53498: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.53554: no more pending results, returning what we have 51243 1727204724.53558: results queue empty 51243 
1727204724.53559: checking for any_errors_fatal 51243 1727204724.53768: done checking for any_errors_fatal 51243 1727204724.53770: checking for max_fail_percentage 51243 1727204724.53773: done checking for max_fail_percentage 51243 1727204724.53774: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.53775: done checking to see if all hosts have failed 51243 1727204724.53775: getting the remaining hosts for this loop 51243 1727204724.53777: done getting the remaining hosts for this loop 51243 1727204724.53782: getting the next task for host managed-node3 51243 1727204724.53789: done getting next task for host managed-node3 51243 1727204724.53794: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 51243 1727204724.53799: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.53817: getting variables 51243 1727204724.53819: in VariableManager get_vars() 51243 1727204724.53934: Calling all_inventory to load vars for managed-node3 51243 1727204724.53938: Calling groups_inventory to load vars for managed-node3 51243 1727204724.53941: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.53951: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.53953: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.53956: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.54223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.54572: done with get_vars() 51243 1727204724.54587: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.047) 0:00:07.157 ***** 51243 1727204724.54968: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 51243 1727204724.55807: worker is 1 (out of 1 available) 51243 1727204724.55819: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 51243 1727204724.55831: done queuing things up, now waiting for results queue to drain 51243 1727204724.55832: waiting for pending results... 
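[editor's note] The "Configure networking connection profiles" and "Configure networking state" tasks queued here are the points where the role would actually apply its interface variables (`network_connections`, `network_state`); in this run both are skipped by the `== '7'` guard. For context, the role is normally driven with variables shaped like this (the profile values below are illustrative, not taken from this run):

```yaml
# Illustrative invocation of the role -- not this test's playbook.
- hosts: managed-node3
  roles:
    - fedora.linux_system_roles.network
  vars:
    network_connections:      # consumed by "Configure networking connection profiles"
      - name: eth0            # example profile for illustration only
        type: ethernet
        state: up
        ip:
          dhcp4: true
```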
51243 1727204724.56147: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 51243 1727204724.56503: in run() - task 127b8e07-fff9-5c5d-847b-00000000010d 51243 1727204724.56507: variable 'ansible_search_path' from source: unknown 51243 1727204724.56510: variable 'ansible_search_path' from source: unknown 51243 1727204724.56513: calling self._execute() 51243 1727204724.56515: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.56518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.56771: variable 'omit' from source: magic vars 51243 1727204724.57847: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.57871: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.58161: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.58178: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.58186: when evaluation is False, skipping this task 51243 1727204724.58193: _execute() done 51243 1727204724.58201: dumping result to json 51243 1727204724.58209: done dumping result, returning 51243 1727204724.58230: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-5c5d-847b-00000000010d] 51243 1727204724.58240: sending task result for task 127b8e07-fff9-5c5d-847b-00000000010d skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.58438: no more pending results, returning what we have 51243 1727204724.58442: results queue empty 51243 1727204724.58443: checking for any_errors_fatal 51243 1727204724.58450: done checking for any_errors_fatal 51243 1727204724.58451: checking for max_fail_percentage 51243 
1727204724.58453: done checking for max_fail_percentage 51243 1727204724.58454: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.58455: done checking to see if all hosts have failed 51243 1727204724.58455: getting the remaining hosts for this loop 51243 1727204724.58457: done getting the remaining hosts for this loop 51243 1727204724.58463: getting the next task for host managed-node3 51243 1727204724.58475: done getting next task for host managed-node3 51243 1727204724.58479: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 51243 1727204724.58486: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.58572: getting variables 51243 1727204724.58575: in VariableManager get_vars() 51243 1727204724.58631: Calling all_inventory to load vars for managed-node3 51243 1727204724.58634: Calling groups_inventory to load vars for managed-node3 51243 1727204724.58637: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.58652: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.58655: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.58659: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.59678: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000010d 51243 1727204724.59682: WORKER PROCESS EXITING 51243 1727204724.59701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.59961: done with get_vars() 51243 1727204724.59981: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.051) 0:00:07.208 ***** 51243 1727204724.60087: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 51243 1727204724.60427: worker is 1 (out of 1 available) 51243 1727204724.60441: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 51243 1727204724.60455: done queuing things up, now waiting for results queue to drain 51243 1727204724.60457: waiting for pending results... 
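[editor's note] Each `TASK [...]` banner ends with two stamps: the previous task's duration in parentheses and the cumulative playbook time, e.g. `(0:00:00.051) 0:00:07.208`. Rendering a float of seconds into that `H:MM:SS.mmm` shape can be sketched as:

```python
def fmt_elapsed(seconds: float) -> str:
    """Render seconds as H:MM:SS.mmm, the shape used in the task banners."""
    ms_total = round(seconds * 1000)            # work in whole milliseconds
    hours, rest = divmod(ms_total, 3_600_000)
    minutes, rest = divmod(rest, 60_000)
    secs, millis = divmod(rest, 1000)
    return f"{hours}:{minutes:02d}:{secs:02d}.{millis:03d}"
```

For example `fmt_elapsed(7.208)` gives `"0:00:07.208"`, matching the cumulative stamp above. (Note the per-task and cumulative values in the log come from separately rounded floats, so the cumulative column can appear off by a millisecond from summing the per-task column.)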
51243 1727204724.60765: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 51243 1727204724.60932: in run() - task 127b8e07-fff9-5c5d-847b-00000000010e 51243 1727204724.60955: variable 'ansible_search_path' from source: unknown 51243 1727204724.60963: variable 'ansible_search_path' from source: unknown 51243 1727204724.61016: calling self._execute() 51243 1727204724.61114: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.61132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.61150: variable 'omit' from source: magic vars 51243 1727204724.61581: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.61608: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.61841: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.61853: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.61860: when evaluation is False, skipping this task 51243 1727204724.61870: _execute() done 51243 1727204724.61876: dumping result to json 51243 1727204724.61882: done dumping result, returning 51243 1727204724.61894: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-5c5d-847b-00000000010e] 51243 1727204724.61903: sending task result for task 127b8e07-fff9-5c5d-847b-00000000010e 51243 1727204724.62171: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000010e 51243 1727204724.62175: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.62231: no more pending results, returning what we have 51243 1727204724.62235: results queue empty 51243 1727204724.62236: checking for any_errors_fatal 51243 
1727204724.62245: done checking for any_errors_fatal 51243 1727204724.62245: checking for max_fail_percentage 51243 1727204724.62247: done checking for max_fail_percentage 51243 1727204724.62248: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.62249: done checking to see if all hosts have failed 51243 1727204724.62250: getting the remaining hosts for this loop 51243 1727204724.62251: done getting the remaining hosts for this loop 51243 1727204724.62256: getting the next task for host managed-node3 51243 1727204724.62264: done getting next task for host managed-node3 51243 1727204724.62270: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 51243 1727204724.62276: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.62303: getting variables 51243 1727204724.62305: in VariableManager get_vars() 51243 1727204724.62547: Calling all_inventory to load vars for managed-node3 51243 1727204724.62552: Calling groups_inventory to load vars for managed-node3 51243 1727204724.62556: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.62572: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.62575: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.62579: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.62855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.63116: done with get_vars() 51243 1727204724.63131: done getting variables 51243 1727204724.63205: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.031) 0:00:07.240 ***** 51243 1727204724.63246: entering _queue_task() for managed-node3/debug 51243 1727204724.63710: worker is 1 (out of 1 available) 51243 1727204724.63726: exiting _queue_task() for managed-node3/debug 51243 1727204724.63738: done queuing things up, now waiting for results queue to drain 51243 1727204724.63740: waiting for pending results... 
51243 1727204724.63965: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 51243 1727204724.64125: in run() - task 127b8e07-fff9-5c5d-847b-00000000010f 51243 1727204724.64151: variable 'ansible_search_path' from source: unknown 51243 1727204724.64159: variable 'ansible_search_path' from source: unknown 51243 1727204724.64214: calling self._execute() 51243 1727204724.64322: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.64334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.64351: variable 'omit' from source: magic vars 51243 1727204724.64843: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.64847: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.64948: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.64964: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.64975: when evaluation is False, skipping this task 51243 1727204724.64983: _execute() done 51243 1727204724.64990: dumping result to json 51243 1727204724.64997: done dumping result, returning 51243 1727204724.65011: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-5c5d-847b-00000000010f] 51243 1727204724.65069: sending task result for task 127b8e07-fff9-5c5d-847b-00000000010f 51243 1727204724.65143: done sending task result for task 127b8e07-fff9-5c5d-847b-00000000010f 51243 1727204724.65146: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204724.65198: no more pending results, returning what we have 51243 1727204724.65202: results queue empty 51243 1727204724.65203: checking for any_errors_fatal 51243 1727204724.65209: done 
checking for any_errors_fatal 51243 1727204724.65210: checking for max_fail_percentage 51243 1727204724.65212: done checking for max_fail_percentage 51243 1727204724.65213: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.65214: done checking to see if all hosts have failed 51243 1727204724.65214: getting the remaining hosts for this loop 51243 1727204724.65217: done getting the remaining hosts for this loop 51243 1727204724.65221: getting the next task for host managed-node3 51243 1727204724.65229: done getting next task for host managed-node3 51243 1727204724.65233: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 51243 1727204724.65238: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.65260: getting variables 51243 1727204724.65262: in VariableManager get_vars() 51243 1727204724.65325: Calling all_inventory to load vars for managed-node3 51243 1727204724.65329: Calling groups_inventory to load vars for managed-node3 51243 1727204724.65332: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.65347: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.65351: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.65354: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.65887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.66151: done with get_vars() 51243 1727204724.66167: done getting variables 51243 1727204724.66233: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.030) 0:00:07.270 ***** 51243 1727204724.66274: entering _queue_task() for managed-node3/debug 51243 1727204724.66635: worker is 1 (out of 1 available) 51243 1727204724.66648: exiting _queue_task() for managed-node3/debug 51243 1727204724.66662: done queuing things up, now waiting for results queue to drain 51243 1727204724.66664: waiting for pending results... 
51243 1727204724.67087: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 51243 1727204724.67200: in run() - task 127b8e07-fff9-5c5d-847b-000000000110 51243 1727204724.67203: variable 'ansible_search_path' from source: unknown 51243 1727204724.67213: variable 'ansible_search_path' from source: unknown 51243 1727204724.67257: calling self._execute() 51243 1727204724.67441: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.67446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.67449: variable 'omit' from source: magic vars 51243 1727204724.67878: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.67895: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.68020: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.68033: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.68041: when evaluation is False, skipping this task 51243 1727204724.68049: _execute() done 51243 1727204724.68056: dumping result to json 51243 1727204724.68063: done dumping result, returning 51243 1727204724.68077: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-5c5d-847b-000000000110] 51243 1727204724.68091: sending task result for task 127b8e07-fff9-5c5d-847b-000000000110 skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204724.68333: no more pending results, returning what we have 51243 1727204724.68336: results queue empty 51243 1727204724.68338: checking for any_errors_fatal 51243 1727204724.68345: done checking for any_errors_fatal 51243 1727204724.68346: checking for max_fail_percentage 51243 1727204724.68348: done checking for 
max_fail_percentage 51243 1727204724.68349: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.68350: done checking to see if all hosts have failed 51243 1727204724.68351: getting the remaining hosts for this loop 51243 1727204724.68352: done getting the remaining hosts for this loop 51243 1727204724.68357: getting the next task for host managed-node3 51243 1727204724.68365: done getting next task for host managed-node3 51243 1727204724.68371: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 51243 1727204724.68377: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.68401: getting variables 51243 1727204724.68404: in VariableManager get_vars() 51243 1727204724.68460: Calling all_inventory to load vars for managed-node3 51243 1727204724.68464: Calling groups_inventory to load vars for managed-node3 51243 1727204724.68634: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.68641: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000110 51243 1727204724.68644: WORKER PROCESS EXITING 51243 1727204724.68658: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.68853: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.68861: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.69122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.69355: done with get_vars() 51243 1727204724.69369: done getting variables 51243 1727204724.69432: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.031) 0:00:07.302 ***** 51243 1727204724.69469: entering _queue_task() for managed-node3/debug 51243 1727204724.69995: worker is 1 (out of 1 available) 51243 1727204724.70006: exiting _queue_task() for managed-node3/debug 51243 1727204724.70019: done queuing things up, now waiting for results queue to drain 51243 1727204724.70021: waiting for pending results... 
51243 1727204724.70135: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 51243 1727204724.70284: in run() - task 127b8e07-fff9-5c5d-847b-000000000111 51243 1727204724.70310: variable 'ansible_search_path' from source: unknown 51243 1727204724.70319: variable 'ansible_search_path' from source: unknown 51243 1727204724.70371: calling self._execute() 51243 1727204724.70583: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.70587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.70589: variable 'omit' from source: magic vars 51243 1727204724.70905: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.70926: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.71046: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.71056: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.71062: when evaluation is False, skipping this task 51243 1727204724.71073: _execute() done 51243 1727204724.71081: dumping result to json 51243 1727204724.71087: done dumping result, returning 51243 1727204724.71101: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-5c5d-847b-000000000111] 51243 1727204724.71112: sending task result for task 127b8e07-fff9-5c5d-847b-000000000111 51243 1727204724.71312: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000111 51243 1727204724.71315: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 51243 1727204724.71395: no more pending results, returning what we have 51243 1727204724.71399: results queue empty 51243 1727204724.71400: checking for any_errors_fatal 51243 1727204724.71406: done checking for 
any_errors_fatal 51243 1727204724.71406: checking for max_fail_percentage 51243 1727204724.71408: done checking for max_fail_percentage 51243 1727204724.71409: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.71410: done checking to see if all hosts have failed 51243 1727204724.71411: getting the remaining hosts for this loop 51243 1727204724.71412: done getting the remaining hosts for this loop 51243 1727204724.71417: getting the next task for host managed-node3 51243 1727204724.71424: done getting next task for host managed-node3 51243 1727204724.71428: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 51243 1727204724.71434: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.71457: getting variables 51243 1727204724.71459: in VariableManager get_vars() 51243 1727204724.71516: Calling all_inventory to load vars for managed-node3 51243 1727204724.71519: Calling groups_inventory to load vars for managed-node3 51243 1727204724.71522: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.71537: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.71540: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.71543: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.71927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.72197: done with get_vars() 51243 1727204724.72211: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.028) 0:00:07.330 ***** 51243 1727204724.72317: entering _queue_task() for managed-node3/ping 51243 1727204724.72654: worker is 1 (out of 1 available) 51243 1727204724.72670: exiting _queue_task() for managed-node3/ping 51243 1727204724.72684: done queuing things up, now waiting for results queue to drain 51243 1727204724.72686: waiting for pending results... 
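The `Evaluated conditional (...)` and `skipping: [...]` entries above reflect task-level `when:` conditions: every item in the list must evaluate true, and the first false item is reported back as `false_condition`. As a rough, hypothetical sketch of that pattern (not the role's actual task file, which lives under `roles/network/tasks/main.yml`):

```yaml
# Illustrative only: a task is skipped when any item in its `when:` list
# evaluates false, matching the "false_condition" shown in the skip result.
- name: Re-test connectivity
  ansible.builtin.ping:
  when:
    - ansible_distribution_major_version != '6'   # evaluated True in the log
    - ansible_distribution_major_version == '7'   # evaluated False, so the task skips
```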
51243 1727204724.73089: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 51243 1727204724.73140: in run() - task 127b8e07-fff9-5c5d-847b-000000000112 51243 1727204724.73162: variable 'ansible_search_path' from source: unknown 51243 1727204724.73174: variable 'ansible_search_path' from source: unknown 51243 1727204724.73224: calling self._execute() 51243 1727204724.73324: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.73337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.73352: variable 'omit' from source: magic vars 51243 1727204724.73769: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.73789: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.73923: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.73934: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.73945: when evaluation is False, skipping this task 51243 1727204724.73953: _execute() done 51243 1727204724.73960: dumping result to json 51243 1727204724.73968: done dumping result, returning 51243 1727204724.73980: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-5c5d-847b-000000000112] 51243 1727204724.73989: sending task result for task 127b8e07-fff9-5c5d-847b-000000000112 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.74223: no more pending results, returning what we have 51243 1727204724.74228: results queue empty 51243 1727204724.74229: checking for any_errors_fatal 51243 1727204724.74237: done checking for any_errors_fatal 51243 1727204724.74238: checking for max_fail_percentage 51243 1727204724.74240: done checking for 
max_fail_percentage 51243 1727204724.74241: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.74242: done checking to see if all hosts have failed 51243 1727204724.74243: getting the remaining hosts for this loop 51243 1727204724.74245: done getting the remaining hosts for this loop 51243 1727204724.74250: getting the next task for host managed-node3 51243 1727204724.74261: done getting next task for host managed-node3 51243 1727204724.74264: ^ task is: TASK: meta (role_complete) 51243 1727204724.74272: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.74298: getting variables 51243 1727204724.74300: in VariableManager get_vars() 51243 1727204724.74354: Calling all_inventory to load vars for managed-node3 51243 1727204724.74357: Calling groups_inventory to load vars for managed-node3 51243 1727204724.74360: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.74575: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.74579: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.74584: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.74814: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000112 51243 1727204724.74819: WORKER PROCESS EXITING 51243 1727204724.74844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.75108: done with get_vars() 51243 1727204724.75122: done getting variables 51243 1727204724.75216: done queuing things up, now waiting for results queue to drain 51243 1727204724.75218: results queue empty 51243 1727204724.75219: checking for any_errors_fatal 51243 1727204724.75222: done checking for any_errors_fatal 51243 1727204724.75223: checking for max_fail_percentage 51243 1727204724.75224: done checking for max_fail_percentage 51243 1727204724.75224: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.75225: done checking to see if all hosts have failed 51243 1727204724.75226: getting the remaining hosts for this loop 51243 1727204724.75227: done getting the remaining hosts for this loop 51243 1727204724.75230: getting the next task for host managed-node3 51243 1727204724.75235: done getting next task for host managed-node3 51243 1727204724.75237: ^ task is: TASK: Include the task 'cleanup_mock_wifi.yml' 51243 1727204724.75239: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 51243 1727204724.75242: getting variables 51243 1727204724.75243: in VariableManager get_vars() 51243 1727204724.75263: Calling all_inventory to load vars for managed-node3 51243 1727204724.75267: Calling groups_inventory to load vars for managed-node3 51243 1727204724.75270: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.75275: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.75277: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.75280: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.75497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.75746: done with get_vars() 51243 1727204724.75757: done getting variables TASK [Include the task 'cleanup_mock_wifi.yml'] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:96 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.035) 0:00:07.366 ***** 51243 1727204724.75842: entering _queue_task() for managed-node3/include_tasks 51243 1727204724.76392: worker is 1 (out of 1 available) 51243 1727204724.76405: exiting _queue_task() for managed-node3/include_tasks 51243 1727204724.76417: done queuing things up, now waiting for results queue to drain 51243 1727204724.76419: waiting for pending results... 
51243 1727204724.76539: running TaskExecutor() for managed-node3/TASK: Include the task 'cleanup_mock_wifi.yml' 51243 1727204724.76687: in run() - task 127b8e07-fff9-5c5d-847b-000000000142 51243 1727204724.76709: variable 'ansible_search_path' from source: unknown 51243 1727204724.76754: calling self._execute() 51243 1727204724.76856: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.76873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.76890: variable 'omit' from source: magic vars 51243 1727204724.77317: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.77338: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.77459: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.77473: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.77480: when evaluation is False, skipping this task 51243 1727204724.77488: _execute() done 51243 1727204724.77496: dumping result to json 51243 1727204724.77503: done dumping result, returning 51243 1727204724.77513: done running TaskExecutor() for managed-node3/TASK: Include the task 'cleanup_mock_wifi.yml' [127b8e07-fff9-5c5d-847b-000000000142] 51243 1727204724.77527: sending task result for task 127b8e07-fff9-5c5d-847b-000000000142 51243 1727204724.77790: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000142 51243 1727204724.77793: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.77848: no more pending results, returning what we have 51243 1727204724.77853: results queue empty 51243 1727204724.77854: checking for any_errors_fatal 51243 1727204724.77856: done checking for any_errors_fatal 51243 1727204724.77857: checking for max_fail_percentage 51243 
1727204724.77858: done checking for max_fail_percentage 51243 1727204724.77859: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.77860: done checking to see if all hosts have failed 51243 1727204724.77861: getting the remaining hosts for this loop 51243 1727204724.77863: done getting the remaining hosts for this loop 51243 1727204724.77870: getting the next task for host managed-node3 51243 1727204724.77878: done getting next task for host managed-node3 51243 1727204724.77881: ^ task is: TASK: Verify network state restored to default 51243 1727204724.77885: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 51243 1727204724.77889: getting variables 51243 1727204724.77891: in VariableManager get_vars() 51243 1727204724.77950: Calling all_inventory to load vars for managed-node3 51243 1727204724.77953: Calling groups_inventory to load vars for managed-node3 51243 1727204724.77956: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.78135: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.78139: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.78143: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.78346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.78561: done with get_vars() 51243 1727204724.78578: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:98 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.028) 0:00:07.394 ***** 51243 1727204724.78687: entering _queue_task() for managed-node3/include_tasks 51243 1727204724.79412: worker is 1 (out of 1 available) 51243 1727204724.79425: exiting _queue_task() for managed-node3/include_tasks 51243 1727204724.79437: done queuing things up, now waiting for results queue to drain 51243 1727204724.79439: waiting for pending results... 
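Every entry in this log is prefixed with the worker PID and a UNIX timestamp (`51243 1727204724.78687: ...`), which is what makes the per-task timings in the final summary possible. A small illustrative parser (hypothetical helper, not part of Ansible) that splits that prefix so consecutive entries can be timed:

```python
import re

# Each verbose line begins with "<pid> <epoch-seconds>: <message>".
LINE_RE = re.compile(r"^(\d+) (\d+\.\d+): (.*)$")

def parse_entry(line):
    """Return (pid, timestamp, message) for a verbose log line, or None."""
    m = LINE_RE.match(line)
    if not m:
        return None
    pid, ts, msg = m.groups()
    return int(pid), float(ts), msg

# Two entries taken verbatim from the log above:
a = parse_entry("51243 1727204724.72317: entering _queue_task() for managed-node3/ping")
b = parse_entry("51243 1727204724.75842: entering _queue_task() for managed-node3/include_tasks")
print(round(b[1] - a[1], 5))  # seconds elapsed between queuing the two tasks
```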
51243 1727204724.79916: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 51243 1727204724.80023: in run() - task 127b8e07-fff9-5c5d-847b-000000000143 51243 1727204724.80090: variable 'ansible_search_path' from source: unknown 51243 1727204724.80256: calling self._execute() 51243 1727204724.80434: variable 'ansible_host' from source: host vars for 'managed-node3' 51243 1727204724.80449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 51243 1727204724.80484: variable 'omit' from source: magic vars 51243 1727204724.81357: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.81381: Evaluated conditional (ansible_distribution_major_version != '6'): True 51243 1727204724.81672: variable 'ansible_distribution_major_version' from source: facts 51243 1727204724.81932: Evaluated conditional (ansible_distribution_major_version == '7'): False 51243 1727204724.81936: when evaluation is False, skipping this task 51243 1727204724.81940: _execute() done 51243 1727204724.81942: dumping result to json 51243 1727204724.81944: done dumping result, returning 51243 1727204724.81947: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [127b8e07-fff9-5c5d-847b-000000000143] 51243 1727204724.81949: sending task result for task 127b8e07-fff9-5c5d-847b-000000000143 51243 1727204724.82029: done sending task result for task 127b8e07-fff9-5c5d-847b-000000000143 51243 1727204724.82048: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 51243 1727204724.82105: no more pending results, returning what we have 51243 1727204724.82109: results queue empty 51243 1727204724.82111: checking for any_errors_fatal 51243 1727204724.82121: done checking for any_errors_fatal 51243 1727204724.82122: checking for max_fail_percentage 51243 
1727204724.82123: done checking for max_fail_percentage 51243 1727204724.82125: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.82125: done checking to see if all hosts have failed 51243 1727204724.82126: getting the remaining hosts for this loop 51243 1727204724.82128: done getting the remaining hosts for this loop 51243 1727204724.82134: getting the next task for host managed-node3 51243 1727204724.82144: done getting next task for host managed-node3 51243 1727204724.82147: ^ task is: TASK: meta (flush_handlers) 51243 1727204724.82150: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204724.82155: getting variables 51243 1727204724.82157: in VariableManager get_vars() 51243 1727204724.82216: Calling all_inventory to load vars for managed-node3 51243 1727204724.82219: Calling groups_inventory to load vars for managed-node3 51243 1727204724.82221: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.82238: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.82241: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.82244: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.83295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.83550: done with get_vars() 51243 1727204724.83785: done getting variables 51243 1727204724.83933: in VariableManager get_vars() 51243 1727204724.83959: Calling all_inventory to load vars for managed-node3 51243 1727204724.83962: Calling groups_inventory to load vars for managed-node3 51243 1727204724.83964: Calling all_plugins_inventory to load vars for managed-node3 51243 
1727204724.84002: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.84005: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.84008: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.84449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.84989: done with get_vars() 51243 1727204724.85009: done queuing things up, now waiting for results queue to drain 51243 1727204724.85012: results queue empty 51243 1727204724.85013: checking for any_errors_fatal 51243 1727204724.85016: done checking for any_errors_fatal 51243 1727204724.85017: checking for max_fail_percentage 51243 1727204724.85018: done checking for max_fail_percentage 51243 1727204724.85019: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.85020: done checking to see if all hosts have failed 51243 1727204724.85021: getting the remaining hosts for this loop 51243 1727204724.85022: done getting the remaining hosts for this loop 51243 1727204724.85025: getting the next task for host managed-node3 51243 1727204724.85029: done getting next task for host managed-node3 51243 1727204724.85031: ^ task is: TASK: meta (flush_handlers) 51243 1727204724.85032: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 51243 1727204724.85035: getting variables 51243 1727204724.85036: in VariableManager get_vars() 51243 1727204724.85057: Calling all_inventory to load vars for managed-node3 51243 1727204724.85060: Calling groups_inventory to load vars for managed-node3 51243 1727204724.85062: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.85072: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.85075: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.85079: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.85249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.85580: done with get_vars() 51243 1727204724.85590: done getting variables 51243 1727204724.85646: in VariableManager get_vars() 51243 1727204724.85664: Calling all_inventory to load vars for managed-node3 51243 1727204724.85669: Calling groups_inventory to load vars for managed-node3 51243 1727204724.85672: Calling all_plugins_inventory to load vars for managed-node3 51243 1727204724.85678: Calling all_plugins_play to load vars for managed-node3 51243 1727204724.85680: Calling groups_plugins_inventory to load vars for managed-node3 51243 1727204724.85683: Calling groups_plugins_play to load vars for managed-node3 51243 1727204724.86159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 51243 1727204724.86501: done with get_vars() 51243 1727204724.86518: done queuing things up, now waiting for results queue to drain 51243 1727204724.86520: results queue empty 51243 1727204724.86521: checking for any_errors_fatal 51243 1727204724.86523: done checking for any_errors_fatal 51243 1727204724.86524: checking for max_fail_percentage 51243 1727204724.86525: done checking for max_fail_percentage 51243 1727204724.86525: checking to see if all hosts have failed and the running result is not 
ok 51243 1727204724.86526: done checking to see if all hosts have failed 51243 1727204724.86527: getting the remaining hosts for this loop 51243 1727204724.86528: done getting the remaining hosts for this loop 51243 1727204724.86537: getting the next task for host managed-node3 51243 1727204724.86541: done getting next task for host managed-node3 51243 1727204724.86542: ^ task is: None 51243 1727204724.86544: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 51243 1727204724.86545: done queuing things up, now waiting for results queue to drain 51243 1727204724.86546: results queue empty 51243 1727204724.86547: checking for any_errors_fatal 51243 1727204724.86548: done checking for any_errors_fatal 51243 1727204724.86548: checking for max_fail_percentage 51243 1727204724.86549: done checking for max_fail_percentage 51243 1727204724.86550: checking to see if all hosts have failed and the running result is not ok 51243 1727204724.86551: done checking to see if all hosts have failed 51243 1727204724.86553: getting the next task for host managed-node3 51243 1727204724.86555: done getting next task for host managed-node3 51243 1727204724.86556: ^ task is: None 51243 1727204724.86557: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed-node3 : ok=7 changed=0 unreachable=0 failed=0 skipped=102 rescued=0 ignored=0 Tuesday 24 September 2024 15:05:24 -0400 (0:00:00.082) 0:00:07.477 ***** =============================================================================== Gathering Facts --------------------------------------------------------- 1.69s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6 Gather the minimum subset of ansible_facts required by the network role test --- 0.80s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Check if system is ostree ----------------------------------------------- 0.78s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Include the task 'enable_epel.yml' -------------------------------------- 0.12s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Verify network state restored to default -------------------------------- 0.08s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:98 fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces --- 0.08s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.07s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Set network provider to 'nm' -------------------------------------------- 0.07s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13 fedora.linux_system_roles.network : Enable network service -------------- 0.07s 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable --- 0.06s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 fedora.linux_system_roles.network : Enable and start wpa_supplicant ----- 0.06s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces --- 0.06s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 fedora.linux_system_roles.network : Install packages -------------------- 0.06s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable --- 0.06s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 fedora.linux_system_roles.network : Ensure ansible_facts used by role --- 0.05s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces --- 0.05s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 TEST: wireless connection with 802.1x TLS-EAP --------------------------- 0.05s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:53 fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces --- 0.05s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 fedora.linux_system_roles.network : Enable and start 
NetworkManager ----- 0.05s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Gathering Facts --------------------------------------------------------- 0.05s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:3 51243 1727204724.86971: RUNNING CLEANUP
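The PLAY RECAP line packs each host's result counters into `key=value` pairs on a single line. A small parser (a sketch, not an Ansible API) that turns such a line into a dict, using the recap above as input:

```python
import re

def parse_recap(line):
    """Split a PLAY RECAP host line into (hostname, {counter: value})."""
    host, _, counters = line.partition(":")
    return host.strip(), {k: int(v) for k, v in re.findall(r"(\w+)=(\d+)", counters)}

host, stats = parse_recap(
    "managed-node3 : ok=7 changed=0 unreachable=0 failed=0 skipped=102 rescued=0 ignored=0"
)
print(host, stats["skipped"])  # managed-node3 102
```

With 102 of 109 tasks skipped and none failed, the recap confirms the run was effectively a no-op pass gated by the `ansible_distribution_major_version == '7'` conditional seen throughout the log.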