34886 1727204481.48132: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-twx
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
34886 1727204481.48483: Added group all to inventory
34886 1727204481.48484: Added group ungrouped to inventory
34886 1727204481.48488: Group all now contains ungrouped
34886 1727204481.48492: Examining possible inventory source: /tmp/network-6Zh/inventory-Sfc.yml
34886 1727204481.60827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
34886 1727204481.60880: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
34886 1727204481.60905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
34886 1727204481.60955: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
34886 1727204481.61019: Loaded config def from plugin (inventory/script)
34886 1727204481.61021: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
34886 1727204481.61055: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
34886 1727204481.61129: Loaded config def from plugin (inventory/yaml)
34886 1727204481.61131: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
34886 1727204481.61207: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
34886 1727204481.61564: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
34886 1727204481.61567: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
34886 1727204481.61569: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
34886 1727204481.61574: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
34886 1727204481.61578: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
34886 1727204481.61635: /tmp/network-6Zh/inventory-Sfc.yml was not parsable by auto
34886 1727204481.61690: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
34886 1727204481.61725: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
34886 1727204481.61796: group all already in inventory
34886 1727204481.61802: set inventory_file for managed-node1
34886 1727204481.61806: set inventory_dir for managed-node1
34886 1727204481.61806: Added host managed-node1 to inventory
34886 1727204481.61808: Added host managed-node1 to group all
34886 1727204481.61809: set ansible_host for managed-node1
34886 1727204481.61810: set ansible_ssh_extra_args for managed-node1
34886 1727204481.61812: set inventory_file for managed-node2
34886 1727204481.61814: set inventory_dir for managed-node2
34886 1727204481.61815: Added host managed-node2 to inventory
34886 1727204481.61816: Added host managed-node2 to group
all 34886 1727204481.61817: set ansible_host for managed-node2 34886 1727204481.61817: set ansible_ssh_extra_args for managed-node2 34886 1727204481.61819: set inventory_file for managed-node3 34886 1727204481.61822: set inventory_dir for managed-node3 34886 1727204481.61822: Added host managed-node3 to inventory 34886 1727204481.61823: Added host managed-node3 to group all 34886 1727204481.61824: set ansible_host for managed-node3 34886 1727204481.61825: set ansible_ssh_extra_args for managed-node3 34886 1727204481.61827: Reconcile groups and hosts in inventory. 34886 1727204481.61830: Group ungrouped now contains managed-node1 34886 1727204481.61831: Group ungrouped now contains managed-node2 34886 1727204481.61833: Group ungrouped now contains managed-node3 34886 1727204481.61903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 34886 1727204481.62009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 34886 1727204481.62050: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 34886 1727204481.62072: Loaded config def from plugin (vars/host_group_vars) 34886 1727204481.62075: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 34886 1727204481.62081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 34886 1727204481.62091: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 34886 1727204481.62129: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 34886 1727204481.62406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204481.62483: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 34886 1727204481.62519: Loaded config def from plugin (connection/local) 34886 1727204481.62522: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 34886 1727204481.63051: Loaded config def from plugin (connection/paramiko_ssh) 34886 1727204481.63054: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 34886 1727204481.63791: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 34886 1727204481.63826: Loaded config def from plugin (connection/psrp) 34886 1727204481.63829: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 34886 1727204481.64438: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 34886 1727204481.64475: Loaded config def from plugin (connection/ssh) 34886 1727204481.64478: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 34886 1727204481.66123: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 34886 1727204481.66186: Loaded config def from plugin (connection/winrm) 34886 1727204481.66191: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 34886 1727204481.66229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 34886 1727204481.66303: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 34886 1727204481.66402: Loaded config def from plugin (shell/cmd) 34886 1727204481.66405: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 34886 1727204481.66440: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 34886 1727204481.66538: Loaded config def from plugin (shell/powershell) 34886 1727204481.66541: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 34886 1727204481.66609: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 34886 1727204481.66867: Loaded config def from plugin (shell/sh) 34886 1727204481.66869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 34886 1727204481.66912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 34886 1727204481.67092: Loaded config def from plugin (become/runas) 34886 1727204481.67096: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 34886 1727204481.67315: Loaded config def from plugin (become/su) 34886 1727204481.67317: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 34886 1727204481.67463: Loaded config def from plugin (become/sudo) 34886 1727204481.67465: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 34886 1727204481.67496: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml 34886 1727204481.67772: in VariableManager get_vars() 34886 1727204481.67791: done with get_vars() 34886 1727204481.67903: trying /usr/local/lib/python3.12/site-packages/ansible/modules 34886 1727204481.70898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 34886 1727204481.71059: in VariableManager get_vars() 34886 1727204481.71065: done with get_vars() 34886 1727204481.71068: variable 'playbook_dir' from source: magic vars 34886 1727204481.71070: variable 'ansible_playbook_python' from source: magic vars 34886 1727204481.71071: variable 'ansible_config_file' from source: 
magic vars 34886 1727204481.71072: variable 'groups' from source: magic vars 34886 1727204481.71073: variable 'omit' from source: magic vars 34886 1727204481.71074: variable 'ansible_version' from source: magic vars 34886 1727204481.71075: variable 'ansible_check_mode' from source: magic vars 34886 1727204481.71076: variable 'ansible_diff_mode' from source: magic vars 34886 1727204481.71077: variable 'ansible_forks' from source: magic vars 34886 1727204481.71078: variable 'ansible_inventory_sources' from source: magic vars 34886 1727204481.71079: variable 'ansible_skip_tags' from source: magic vars 34886 1727204481.71080: variable 'ansible_limit' from source: magic vars 34886 1727204481.71081: variable 'ansible_run_tags' from source: magic vars 34886 1727204481.71082: variable 'ansible_verbosity' from source: magic vars 34886 1727204481.71132: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml 34886 1727204481.71894: in VariableManager get_vars() 34886 1727204481.71916: done with get_vars() 34886 1727204481.71968: in VariableManager get_vars() 34886 1727204481.71986: done with get_vars() 34886 1727204481.72338: in VariableManager get_vars() 34886 1727204481.72356: done with get_vars() 34886 1727204481.72363: variable 'omit' from source: magic vars 34886 1727204481.72386: variable 'omit' from source: magic vars 34886 1727204481.72434: in VariableManager get_vars() 34886 1727204481.72447: done with get_vars() 34886 1727204481.72506: in VariableManager get_vars() 34886 1727204481.72524: done with get_vars() 34886 1727204481.72570: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 34886 1727204481.72872: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 34886 1727204481.72981: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 34886 1727204481.73527: in VariableManager get_vars() 34886 1727204481.73542: done with get_vars() 34886 1727204481.73893: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 34886 1727204481.74010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34886 1727204481.75782: in VariableManager get_vars() 34886 1727204481.75798: done with get_vars() 34886 1727204481.75830: in VariableManager get_vars() 34886 1727204481.75859: done with get_vars() 34886 1727204481.76444: in VariableManager get_vars() 34886 1727204481.76457: done with get_vars() 34886 1727204481.76461: variable 'omit' from source: magic vars 34886 1727204481.76469: variable 'omit' from source: magic vars 34886 1727204481.76495: in VariableManager get_vars() 34886 1727204481.76506: done with get_vars() 34886 1727204481.76525: in VariableManager get_vars() 34886 1727204481.76539: done with get_vars() 34886 1727204481.76561: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 34886 1727204481.76653: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 34886 1727204481.76793: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 34886 1727204481.78473: in VariableManager get_vars() 34886 1727204481.78505: done with get_vars() redirecting (type: 
action) ansible.builtin.yum to ansible.builtin.dnf 34886 1727204481.80384: in VariableManager get_vars() 34886 1727204481.80403: done with get_vars() 34886 1727204481.80511: in VariableManager get_vars() 34886 1727204481.80528: done with get_vars() 34886 1727204481.80574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 34886 1727204481.80585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 34886 1727204481.80787: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 34886 1727204481.81002: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 34886 1727204481.81009: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 34886 1727204481.81053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 34886 1727204481.81086: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 34886 1727204481.81339: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 34886 1727204481.81428: Loaded config def from plugin (callback/default) 34886 1727204481.81432: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 34886 1727204481.82952: Loaded config def from plugin (callback/junit) 34886 1727204481.82956: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 34886 1727204481.83022: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 34886 1727204481.83079: Loaded config def from plugin (callback/minimal) 34886 1727204481.83081: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 34886 1727204481.83116: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 34886 1727204481.83176: 
Loaded config def from plugin (callback/tree)
34886 1727204481.83179: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
34886 1727204481.83284: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
34886 1727204481.83287: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ipv6_nm.yml ****************************************************
2 plays in /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml
34886 1727204481.83311: in VariableManager get_vars()
34886 1727204481.83323: done with get_vars()
34886 1727204481.83327: in VariableManager get_vars()
34886 1727204481.83334: done with get_vars()
34886 1727204481.83337: variable 'omit' from source: magic vars
34886 1727204481.83367: in VariableManager get_vars()
34886 1727204481.83377: done with get_vars()
34886 1727204481.83400: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ipv6.yml' with nm as provider] *************
34886 1727204481.83857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
34886 1727204481.83918: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
34886 1727204481.83948: getting the remaining hosts for this loop
34886 1727204481.83950: done getting the remaining hosts for this loop
34886 1727204481.83953: getting the next task for host managed-node3
34886 1727204481.83956: done getting next task for host managed-node3
34886 1727204481.83958: ^ task is: TASK: Gathering Facts
34886 1727204481.83959: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34886 1727204481.83961: getting variables
34886 1727204481.83962: in VariableManager get_vars()
34886 1727204481.83970: Calling all_inventory to load vars for managed-node3
34886 1727204481.83972: Calling groups_inventory to load vars for managed-node3
34886 1727204481.83974: Calling all_plugins_inventory to load vars for managed-node3
34886 1727204481.83984: Calling all_plugins_play to load vars for managed-node3
34886 1727204481.83995: Calling groups_plugins_inventory to load vars for managed-node3
34886 1727204481.83998: Calling groups_plugins_play to load vars for managed-node3
34886 1727204481.84028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34886 1727204481.84073: done with get_vars()
34886 1727204481.84078: done getting variables
34886 1727204481.84142: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
Tuesday 24 September 2024 15:01:21 -0400 (0:00:00.009) 0:00:00.009 *****
34886 1727204481.84163: entering _queue_task() for managed-node3/gather_facts
34886 1727204481.84164: Creating lock for gather_facts
34886 1727204481.84474: worker is 1 (out of 1 available)
34886 1727204481.84486: exiting _queue_task() for managed-node3/gather_facts
34886 1727204481.84503: done queuing things up, now waiting for results queue to drain
34886 1727204481.84506: waiting for pending results...
34886 1727204481.84659: running TaskExecutor() for managed-node3/TASK: Gathering Facts 34886 1727204481.84722: in run() - task 12b410aa-8751-04b9-2e74-0000000000b9 34886 1727204481.84740: variable 'ansible_search_path' from source: unknown 34886 1727204481.84773: calling self._execute() 34886 1727204481.84826: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204481.84833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204481.84848: variable 'omit' from source: magic vars 34886 1727204481.84955: variable 'omit' from source: magic vars 34886 1727204481.85110: variable 'omit' from source: magic vars 34886 1727204481.85118: variable 'omit' from source: magic vars 34886 1727204481.85121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204481.85196: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204481.85199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204481.85202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204481.85205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204481.85245: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204481.85256: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204481.85266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204481.85402: Set connection var ansible_timeout to 10 34886 1727204481.85417: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204481.85430: Set connection var ansible_connection to ssh 34886 1727204481.85442: Set connection var ansible_shell_executable to /bin/sh 34886 1727204481.85456: Set connection var ansible_pipelining to False 34886 1727204481.85463: Set connection var ansible_shell_type to sh 34886 1727204481.85497: variable 'ansible_shell_executable' from source: unknown 34886 1727204481.85506: variable 'ansible_connection' from source: unknown 34886 1727204481.85594: variable 'ansible_module_compression' from source: unknown 34886 1727204481.85598: variable 'ansible_shell_type' from source: unknown 34886 1727204481.85600: variable 'ansible_shell_executable' from source: unknown 34886 1727204481.85603: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204481.85605: variable 'ansible_pipelining' from source: unknown 34886 1727204481.85607: variable 'ansible_timeout' from source: unknown 34886 1727204481.85609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204481.85835: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204481.85856: variable 'omit' from source: magic vars 34886 1727204481.85871: starting attempt loop 34886 1727204481.85884: running the handler 34886 1727204481.85933: variable 'ansible_facts' from source: unknown 34886 1727204481.85937: _low_level_execute_command(): starting 34886 1727204481.85940: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204481.86465: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204481.86470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204481.86473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204481.86529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204481.86537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204481.86581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204481.88331: stdout chunk (state=3): >>>/root <<< 34886 1727204481.88442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204481.88492: stderr chunk (state=3): >>><<< 34886 1727204481.88496: stdout chunk (state=3): >>><<< 34886 1727204481.88518: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204481.88531: _low_level_execute_command(): starting 34886 1727204481.88538: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466 `" && echo ansible-tmp-1727204481.885179-34925-29945639115466="` echo /root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466 `" ) && sleep 0' 34886 1727204481.88954: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204481.88987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204481.88992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204481.88995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204481.88997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204481.89052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204481.89056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204481.89102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204481.91062: stdout chunk (state=3): >>>ansible-tmp-1727204481.885179-34925-29945639115466=/root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466 <<< 34886 1727204481.91180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204481.91234: stderr chunk (state=3): >>><<< 34886 1727204481.91238: stdout chunk (state=3): >>><<< 34886 1727204481.91260: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204481.885179-34925-29945639115466=/root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204481.91292: variable 'ansible_module_compression' from source: unknown 34886 1727204481.91339: ANSIBALLZ: Using generic lock for ansible.legacy.setup 34886 1727204481.91342: ANSIBALLZ: Acquiring lock 34886 1727204481.91345: ANSIBALLZ: Lock acquired: 139734986903328 34886 1727204481.91347: ANSIBALLZ: Creating module 34886 1727204482.17732: ANSIBALLZ: Writing module into payload 34886 1727204482.18092: ANSIBALLZ: 
Writing module 34886 1727204482.18096: ANSIBALLZ: Renaming module 34886 1727204482.18099: ANSIBALLZ: Done creating module 34886 1727204482.18101: variable 'ansible_facts' from source: unknown 34886 1727204482.18103: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204482.18105: _low_level_execute_command(): starting 34886 1727204482.18107: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 34886 1727204482.18640: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204482.18655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204482.18672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204482.18698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204482.18717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204482.18734: stderr chunk (state=3): >>>debug2: match not found <<< 34886 1727204482.18750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204482.18771: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34886 1727204482.18784: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204482.18807: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34886 1727204482.18901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204482.18916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204482.18999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204482.20772: stdout chunk (state=3): >>>PLATFORM <<< 34886 1727204482.20849: stdout chunk (state=3): >>>Linux <<< 34886 1727204482.20879: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 34886 1727204482.21105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204482.21118: stdout chunk (state=3): >>><<< 34886 1727204482.21134: stderr chunk (state=3): >>><<< 34886 1727204482.21157: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204482.21177 [managed-node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 34886 1727204482.21246: _low_level_execute_command(): starting 34886 1727204482.21257: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 34886 1727204482.21625: Sending initial data 34886 1727204482.21629: Sent initial data (1181 bytes) 34886 1727204482.21913: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204482.21932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204482.21956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204482.21977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204482.21996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204482.22009: stderr chunk (state=3): >>>debug2: match not found <<< 34886 1727204482.22027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204482.22046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34886 1727204482.22071: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204482.22083: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34886 1727204482.22104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204482.22183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204482.22215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204482.22237: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204482.22268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204482.22336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204482.26046: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty 
Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 34886 1727204482.26596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204482.26635: stderr chunk (state=3): >>><<< 34886 1727204482.26795: stdout chunk (state=3): >>><<< 34886 1727204482.26799: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204482.26802: variable 'ansible_facts' from source: unknown 34886 1727204482.26804: variable 'ansible_facts' from source: unknown 34886 1727204482.26816: variable 'ansible_module_compression' from source: unknown 34886 1727204482.26867: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34886 1727204482.26906: variable 'ansible_facts' from source: unknown 34886 1727204482.27120: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466/AnsiballZ_setup.py 34886 1727204482.27382: Sending initial data 34886 1727204482.27386: Sent initial data (152 bytes) 34886 1727204482.27938: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204482.28005: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204482.28072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204482.28094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204482.28144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204482.28185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204482.29851: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204482.29905: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204482.29971: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpychay3qg /root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466/AnsiballZ_setup.py <<< 34886 1727204482.30006: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpychay3qg" to remote "/root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466/AnsiballZ_setup.py" <<< 34886 1727204482.31720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204482.31756: stderr chunk (state=3): >>><<< 34886 1727204482.31765: stdout chunk (state=3): >>><<< 34886 1727204482.31799: done transferring module to remote 34886 1727204482.31823: _low_level_execute_command(): starting 34886 1727204482.31861: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466/ /root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466/AnsiballZ_setup.py && sleep 0' 34886 1727204482.32283: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204482.32327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204482.32331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204482.32334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204482.32336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204482.32338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204482.32385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204482.32392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204482.32435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204482.34283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204482.34336: stderr chunk (state=3): >>><<< 34886 1727204482.34340: stdout chunk (state=3): >>><<< 34886 1727204482.34356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204482.34359: _low_level_execute_command(): starting 34886 1727204482.34372: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466/AnsiballZ_setup.py && sleep 0' 34886 1727204482.34834: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204482.34837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204482.34840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204482.34843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204482.34892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204482.34899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204482.34943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204482.37129: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34886 1727204482.37165: stdout chunk (state=3): >>>import _imp # builtin <<< 34886 1727204482.37205: stdout chunk (state=3): >>>import '_thread' # <<< 34886 1727204482.37208: stdout chunk (state=3): >>>import '_warnings' # <<< 34886 1727204482.37220: stdout chunk (state=3): >>>import '_weakref' # <<< 34886 1727204482.37278: stdout chunk (state=3): >>>import '_io' # <<< 34886 1727204482.37284: stdout chunk (state=3): >>>import 'marshal' # <<< 34886 1727204482.37321: stdout chunk (state=3): >>>import 'posix' # <<< 34886 1727204482.37356: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 34886 1727204482.37386: stdout chunk (state=3): >>>import 'time' # <<< 34886 1727204482.37401: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 34886 1727204482.37454: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 34886 1727204482.37459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.37474: stdout chunk (state=3): >>>import '_codecs' # <<< 34886 1727204482.37510: stdout chunk (state=3): >>>import 'codecs' # <<< 34886 1727204482.37537: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34886 1727204482.37575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 34886 1727204482.37578: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cd2c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0ccfbad0> <<< 34886 1727204482.37612: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 34886 1727204482.37636: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cd2ea20> <<< 34886 1727204482.37646: stdout chunk (state=3): >>>import '_signal' # <<< 34886 1727204482.37672: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 34886 1727204482.37702: stdout chunk (state=3): >>>import 'io' # <<< 34886 1727204482.37735: stdout chunk (state=3): >>>import '_stat' # <<< 34886 1727204482.37739: stdout chunk (state=3): >>>import 'stat' # <<< 34886 1727204482.37831: stdout chunk (state=3): >>>import '_collections_abc' # <<< 34886 1727204482.37862: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 34886 1727204482.37894: stdout chunk (state=3): >>>import 'os' # <<< 34886 1727204482.37907: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 34886 1727204482.37926: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 34886 1727204482.37948: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 34886 1727204482.37962: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 34886 1727204482.37969: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 34886 1727204482.37995: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 34886 1727204482.38001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34886 1727204482.38025: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb1d0a0> <<< 34886 1727204482.38088: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 34886 1727204482.38104: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.38110: stdout chunk (state=3): >>>import 
'_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb1dfd0> <<< 34886 1727204482.38138: stdout chunk (state=3): >>>import 'site' # <<< 34886 1727204482.38170: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 34886 1727204482.38584: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34886 1727204482.38597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 34886 1727204482.38626: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 34886 1727204482.38633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.38655: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34886 1727204482.38695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34886 1727204482.38714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34886 1727204482.38744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34886 1727204482.38766: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb5bdd0> <<< 34886 1727204482.38784: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 34886 1727204482.38819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 34886 1727204482.38822: stdout chunk (state=3): >>>import '_operator' # <<< 34886 1727204482.38833: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb5bfe0> <<< 34886 1727204482.38853: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34886 1727204482.38881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34886 1727204482.38903: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34886 1727204482.38961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.38975: stdout chunk (state=3): >>>import 'itertools' # <<< 34886 1727204482.39005: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb93800> <<< 34886 1727204482.39041: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 34886 1727204482.39056: stdout chunk (state=3): >>>import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb93e90> <<< 34886 1727204482.39067: stdout chunk (state=3): >>>import '_collections' # <<< 34886 1727204482.39117: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb73aa0> <<< 34886 1727204482.39129: stdout chunk (state=3): >>>import '_functools' # <<< 34886 1727204482.39150: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb711c0> <<< 34886 1727204482.39248: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb58f80> <<< 34886 1727204482.39276: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34886 1727204482.39314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 34886 1727204482.39322: stdout chunk (state=3): >>>import '_sre' # <<< 34886 1727204482.39340: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 34886 1727204482.39358: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 34886 1727204482.39393: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 34886 1727204482.39395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 34886 1727204482.39437: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbb76e0> <<< 34886 1727204482.39440: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbb6300> <<< 34886 1727204482.39476: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb721b0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbb4bf0> <<< 34886 1727204482.39542: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 34886 1727204482.39569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbe8710> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb58200> <<< 34886 1727204482.39582: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34886 1727204482.39615: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.39630: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0cbe8bc0> import 'struct' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbe8a70> <<< 34886 1727204482.39663: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.39675: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0cbe8e60> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb56d20> <<< 34886 1727204482.39708: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.39731: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34886 1727204482.39773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 34886 1727204482.39792: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbe9520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbe91f0> import 'importlib.machinery' # <<< 34886 1727204482.39826: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 34886 1727204482.39855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbea420> <<< 34886 1727204482.39876: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 34886 1727204482.39895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34886 1727204482.39964: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34886 1727204482.39968: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 34886 1727204482.39970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cc04650> <<< 34886 1727204482.40023: stdout chunk (state=3): >>>import 'errno' # <<< 34886 1727204482.40026: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.40048: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0cc05d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 34886 1727204482.40079: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 34886 1727204482.40100: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cc06c90> <<< 34886 1727204482.40145: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0cc072f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cc061e0> <<< 34886 1727204482.40178: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 34886 1727204482.40181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34886 1727204482.40234: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.40249: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0cc07d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cc074a0> <<< 34886 1727204482.40288: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbea480> <<< 34886 1727204482.40313: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34886 1727204482.40337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34886 1727204482.40360: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34886 1727204482.40378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 34886 1727204482.40422: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.40445: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c903ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34886 1727204482.40481: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.40533: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c92c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c92c470> <<< 34886 1727204482.40544: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c92c740> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.40561: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c92c920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c901e80> <<< 34886 1727204482.40581: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34886 1727204482.40684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34886 1727204482.40715: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 34886 1727204482.40731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 34886 1727204482.40737: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c92df70> <<< 34886 1727204482.40756: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c92cbf0> <<< 34886 1727204482.40779: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbeab70> <<< 34886 1727204482.40804: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34886 1727204482.40859: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.40878: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34886 1727204482.40926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34886 1727204482.40956: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c95a300> <<< 34886 1727204482.41011: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34886 1727204482.41024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.41047: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 34886 1727204482.41064: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34886 1727204482.41120: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c9724b0> <<< 34886 1727204482.41138: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34886 1727204482.41181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34886 1727204482.41239: stdout chunk (state=3): 
>>>import 'ntpath' # <<< 34886 1727204482.41267: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c9af260> <<< 34886 1727204482.41292: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34886 1727204482.41328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34886 1727204482.41357: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34886 1727204482.41398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34886 1727204482.41493: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c9d1a00> <<< 34886 1727204482.41566: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c9af380> <<< 34886 1727204482.41615: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c973140> <<< 34886 1727204482.41641: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 34886 1727204482.41656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c7ec3e0> <<< 34886 1727204482.41669: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c9714f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c92eea0> <<< 34886 1727204482.41836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34886 1727204482.41860: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcf0c7ec680> <<< 34886 1727204482.42036: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_t4mrqe4l/ansible_ansible.legacy.setup_payload.zip' <<< 34886 1727204482.42043: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.42200: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.42230: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 34886 1727204482.42237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34886 1727204482.42284: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34886 1727204482.42359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34886 1727204482.42390: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 
'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c8560f0> <<< 34886 1727204482.42405: stdout chunk (state=3): >>>import '_typing' # <<< 34886 1727204482.42600: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c82cfe0> <<< 34886 1727204482.42608: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c82c140> # zipimport: zlib available <<< 34886 1727204482.42640: stdout chunk (state=3): >>>import 'ansible' # <<< 34886 1727204482.42653: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.42674: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.42690: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.42704: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 34886 1727204482.42714: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.44286: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.45590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 34886 1727204482.45601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c82fe00> <<< 34886 1727204482.45623: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.45655: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 34886 1727204482.45662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 34886 1727204482.45683: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 34886 1727204482.45716: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.45736: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.45742: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c889ac0> <<< 34886 1727204482.45765: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c889850> <<< 34886 1727204482.45804: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c889160> <<< 34886 1727204482.45822: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 34886 1727204482.45834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34886 1727204482.45876: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c8898b0> <<< 34886 1727204482.45879: stdout chunk (state=3): 
>>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c856b70> <<< 34886 1727204482.45885: stdout chunk (state=3): >>>import 'atexit' # <<< 34886 1727204482.45912: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c88a840> <<< 34886 1727204482.45941: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.45954: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c88aa50> <<< 34886 1727204482.45972: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34886 1727204482.46015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34886 1727204482.46033: stdout chunk (state=3): >>>import '_locale' # <<< 34886 1727204482.46078: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c88af30> <<< 34886 1727204482.46102: stdout chunk (state=3): >>>import 'pwd' # <<< 34886 1727204482.46115: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34886 1727204482.46145: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34886 1727204482.46181: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6ecd40> <<< 34886 1727204482.46211: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c6ee960> <<< 34886 1727204482.46239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 34886 1727204482.46252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34886 1727204482.46298: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6ef260> <<< 34886 1727204482.46313: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34886 1727204482.46340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 34886 1727204482.46366: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f0440> <<< 34886 1727204482.46393: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34886 1727204482.46429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34886 1727204482.46453: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34886 1727204482.46520: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f2ed0> <<< 34886 1727204482.46560: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.46566: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c6f2fc0> <<< 34886 1727204482.46585: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f10d0> <<< 34886 1727204482.46610: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34886 1727204482.46637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34886 1727204482.46661: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34886 1727204482.46687: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34886 1727204482.46716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34886 1727204482.46741: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 34886 1727204482.46767: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f6cc0> <<< 34886 1727204482.46774: stdout chunk (state=3): >>>import '_tokenize' # <<< 34886 1727204482.46845: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f57c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f5520> <<< 34886 1727204482.46876: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 34886 1727204482.46881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34886 1727204482.46964: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f7ef0> <<< 34886 1727204482.46994: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f16a0> <<< 34886 1727204482.47024: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c73ad80> <<< 34886 1727204482.47056: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 34886 1727204482.47062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c73aed0> <<< 34886 1727204482.47082: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 34886 1727204482.47107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34886 1727204482.47127: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34886 1727204482.47166: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.47173: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c740ad0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c740890> <<< 34886 1727204482.47196: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34886 1727204482.47320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34886 1727204482.47377: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c743080> <<< 34886 1727204482.47384: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c7411c0> <<< 34886 1727204482.47412: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34886 1727204482.47460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.47482: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 34886 1727204482.47503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 34886 1727204482.47563: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c74a840> <<< 34886 1727204482.47719: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c7431d0> <<< 34886 1727204482.47804: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c74bb60> <<< 34886 1727204482.47842: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c74b530> <<< 34886 1727204482.47895: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c74b9b0> <<< 34886 1727204482.47921: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c73b200> <<< 34886 1727204482.47943: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 34886 1727204482.47948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 34886 1727204482.47967: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 34886 1727204482.47998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 34886 1727204482.48030: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.48062: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c74e990> <<< 34886 1727204482.48248: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.48270: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c74fe00> <<< 34886 1727204482.48281: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c74d130> <<< 34886 1727204482.48316: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c74e4e0> <<< 34886 1727204482.48327: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c74cd10> <<< 34886 1727204482.48336: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.48360: stdout chunk (state=3): 
>>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 34886 1727204482.48383: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.48484: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.48600: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.48612: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.48622: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 34886 1727204482.48633: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.48649: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 34886 1727204482.48673: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.48810: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.48957: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.49647: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.50334: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 34886 1727204482.50350: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 34886 1727204482.50358: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 34886 1727204482.50385: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34886 1727204482.50404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.50461: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c5d9550> <<< 34886 1727204482.50572: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34886 1727204482.50610: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5d8e90> <<< 34886 1727204482.50617: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f5a30> <<< 34886 1727204482.50672: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 34886 1727204482.50685: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.50712: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.50733: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 34886 1727204482.50739: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.50915: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.51104: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 34886 1727204482.51130: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c753dd0> <<< 34886 
1727204482.51136: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.51713: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.52269: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.52362: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.52447: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34886 1727204482.52467: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.52509: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.52549: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 34886 1727204482.52571: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.52656: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.52769: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34886 1727204482.52793: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.52797: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.52818: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 34886 1727204482.52826: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.52871: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.52924: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 34886 1727204482.52931: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.53201: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.53489: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34886 1727204482.53560: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34886 1727204482.53577: stdout chunk (state=3): >>>import '_ast' # <<< 34886 1727204482.53663: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5db140> <<< 34886 1727204482.53682: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.53763: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.53855: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 34886 1727204482.53866: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 34886 1727204482.53873: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 34886 1727204482.53904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34886 1727204482.53994: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.54121: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c5e1940> <<< 34886 1727204482.54182: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module 
'_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c5e22d0> <<< 34886 1727204482.54196: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c753020> <<< 34886 1727204482.54209: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.54257: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.54303: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34886 1727204482.54310: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.54356: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.54409: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.54467: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.54547: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34886 1727204482.54594: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.54688: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c5e1040> <<< 34886 1727204482.54738: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5e2480> <<< 34886 1727204482.54776: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 34886 1727204482.54782: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 34886 1727204482.54855: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.54924: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.54954: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.55003: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 34886 1727204482.55011: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.55027: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34886 1727204482.55054: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34886 1727204482.55071: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34886 1727204482.55139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34886 1727204482.55156: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34886 1727204482.55177: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34886 1727204482.55235: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c67a810> <<< 34886 1727204482.55285: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5ec440> <<< 34886 1727204482.55373: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5eb920> <<< 34886 1727204482.55376: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5ea480> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 34886 1727204482.55396: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.55419: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.55448: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 34886 1727204482.55513: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34886 1727204482.55532: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.55536: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 34886 1727204482.55562: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.55626: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.55699: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.55712: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.55738: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.55780: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.55828: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.55865: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.55910: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 34886 1727204482.55914: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.56005: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.56078: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.56107: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.56140: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 34886 1727204482.56155: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.56350: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.56549: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.56592: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.56648: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.56675: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 34886 1727204482.56692: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 34886 1727204482.56713: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 34886 1727204482.56735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 34886 1727204482.56770: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c680ec0> <<< 34886 1727204482.56796: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 34886 1727204482.56812: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 34886 1727204482.56820: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 34886 1727204482.56867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 34886 1727204482.56887: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 34886 1727204482.56913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 34886 1727204482.56919: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bb4bef0> <<< 34886 1727204482.56949: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.56967: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0bb4c260> <<< 34886 1727204482.57022: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5ccfb0> <<< 34886 1727204482.57042: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5cc560> <<< 34886 1727204482.57070: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c682db0> <<< 34886 1727204482.57092: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c682840> <<< 34886 1727204482.57105: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 34886 1727204482.57160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 34886 1727204482.57183: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 34886 1727204482.57197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 34886 1727204482.57217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 34886 1727204482.57224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 34886 1727204482.57258: stdout chunk (state=3): >>># 
extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.57265: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0bb4f200> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bb4eab0> <<< 34886 1727204482.57297: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0bb4ec90> <<< 34886 1727204482.57317: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bb4df40> <<< 34886 1727204482.57334: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 34886 1727204482.57462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 34886 1727204482.57469: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bb4f350> <<< 34886 1727204482.57500: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 34886 1727204482.57528: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 34886 1727204482.57560: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.57567: stdout chunk (state=3): >>>import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0bbb9e80> <<< 34886 1727204482.57598: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bb4fe60> <<< 34886 1727204482.57622: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c682a50> import 'ansible.module_utils.facts.timeout' # <<< 34886 1727204482.57644: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 34886 1727204482.57663: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.57679: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 34886 1727204482.57702: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.57764: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.57827: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 34886 1727204482.57842: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.57900: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.57953: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 34886 
1727204482.57972: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.57985: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.57998: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # <<< 34886 1727204482.58006: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.58038: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.58070: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 34886 1727204482.58076: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.58135: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.58194: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 34886 1727204482.58202: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.58243: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.58296: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 34886 1727204482.58302: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.58366: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.58427: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.58494: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.58557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 34886 1727204482.58568: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 34886 1727204482.58574: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.59140: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.59650: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 34886 1727204482.59666: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.59730: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.59785: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.59817: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.59854: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 34886 1727204482.59872: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.59902: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.59934: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 34886 1727204482.59942: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.60006: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.60064: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 34886 1727204482.60082: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.60113: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.60142: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 34886 1727204482.60155: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.60188: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.60216: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 34886 1727204482.60230: stdout chunk (state=3): >>># zipimport: zlib available <<< 
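For orientation: the "# ... matches ..." and "import X # <_frozen_importlib_external.SourceFileLoader object at 0x...>" lines inside these stdout chunks are CPython's verbose-import trace, interleaved with zipimport messages as the zipped setup payload found earlier in the stream ('/tmp/ansible_ansible.legacy.setup_payload_t4mrqe4l/ansible_ansible.legacy.setup_payload.zip') is imported. The snippet below is an illustrative sketch only, not part of this run; it just reproduces the same trace format locally by starting a child interpreter with -v (the trace arrives on the child's stderr).

    # Sketch only: run a child interpreter with -v, which emits the same
    # "import X # <loader>" / "# code object from ..." trace lines.
    import subprocess
    import sys

    proc = subprocess.run(
        [sys.executable, "-v", "-c", "import json"],
        capture_output=True,
        text=True,
    )

    # Keep just the import-trace lines, which resemble the chunks above.
    for line in proc.stderr.splitlines():
        if line.startswith("import ") or line.startswith("# code object from"):
            print(line)
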
34886 1727204482.60311: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.60407: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 34886 1727204482.60439: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bbbbbc0> <<< 34886 1727204482.60460: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 34886 1727204482.60498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 34886 1727204482.60622: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bbbab70> <<< 34886 1727204482.60633: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 34886 1727204482.60639: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.60710: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.60774: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 34886 1727204482.60795: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.60887: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.60985: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 34886 1727204482.60994: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.61065: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.61143: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 34886 1727204482.61157: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.61199: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.61253: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 34886 1727204482.61299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 34886 1727204482.61374: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.61435: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0bbe6180> <<< 34886 1727204482.61641: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bbd4530> <<< 34886 1727204482.61649: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 34886 1727204482.61665: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.61723: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.61782: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 34886 1727204482.61795: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.61882: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.61976: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.62107: stdout chunk (state=3): >>># zipimport: zlib available <<< 
34886 1727204482.62303: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 34886 1727204482.62325: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.62372: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 34886 1727204482.62376: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.62423: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.62530: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 34886 1727204482.62711: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0ba01c40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bbd7290> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 34886 1727204482.62715: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 34886 1727204482.62849: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.63041: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 34886 1727204482.63222: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.63459: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.63463: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34886 1727204482.63551: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.63721: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 34886 1727204482.63846: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.64065: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 34886 1727204482.64085: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34886 1727204482.64796: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.65339: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 34886 1727204482.65448: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.65554: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 34886 1727204482.65569: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.65668: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.65923: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 34886 1727204482.65953: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.66251: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 34886 1727204482.66255: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.66296: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 34886 1727204482.66473: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.66506: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.66725: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.67005: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available <<< 34886 1727204482.67040: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 34886 1727204482.67046: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.67071: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.67156: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 34886 1727204482.67182: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.67257: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 34886 1727204482.67273: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.67293: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.67321: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 34886 1727204482.67329: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.67395: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.67456: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 34886 1727204482.67465: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.67525: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.67615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 34886 1727204482.67619: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.67892: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68186: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 34886 1727204482.68196: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68257: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68330: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 34886 1727204482.68332: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68370: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68409: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 34886 1727204482.68413: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68449: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68490: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.netbsd' # <<< 34886 1727204482.68497: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68530: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68571: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 34886 1727204482.68578: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68665: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68748: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 34886 1727204482.68770: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68777: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 34886 1727204482.68801: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68841: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 34886 1727204482.68898: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68925: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.68950: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.69256: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 34886 1727204482.69274: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.69329: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 34886 1727204482.69342: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.69556: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.69774: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 34886 1727204482.69784: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.69834: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.69882: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 34886 1727204482.69898: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.69941: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.70194: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 34886 1727204482.70198: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.70200: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 34886 1727204482.70296: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204482.70394: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 34886 1727204482.70607: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 34886 1727204482.70911: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/idna.py <<< 34886 1727204482.70931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 34886 1727204482.70937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 34886 1727204482.70962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 34886 1727204482.70997: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204482.71008: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0ba2b4d0> <<< 34886 1727204482.71014: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0ba29d60> <<< 34886 1727204482.71064: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0ba2a090> <<< 34886 1727204482.85577: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 34886 1727204482.85606: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 34886 1727204482.85610: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0ba70710> <<< 34886 1727204482.85645: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 34886 1727204482.85660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 34886 1727204482.85687: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0ba71760> <<< 34886 1727204482.85734: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 34886 1727204482.85754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204482.85778: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 34886 1727204482.85787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 34886 1727204482.85815: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bac7740> <<< 34886 1727204482.85835: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0ba735f0> <<< 34886 1727204482.86096: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: 
thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 34886 1727204483.10207: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", 
"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_R<<< 34886 1727204483.10242: stdout chunk (state=3): >>>OLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "", "ansible_local": {}, "ansible_loadavg": {"1m": 0.6171875, "5m": 0.56298828125, "15m": 0.37109375}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "22", "epoch": "1727204482", "epoch_int": "1727204482", "date": "2024-09-24", "time": "15:01:22", "iso8601_micro": "2024-09-24T19:01:22.720092Z", "iso8601": "2024-09-24T19:01:22Z", "iso8601_basic": "20240924T150122720092", "iso8601_basic_short": "20240924T150122", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2848, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 869, "free": 2848}, "nocache": {"free": 3468, "used": 249}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", 
"ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3<<< 34886 1727204483.10271: stdout chunk (state=3): >>>.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 987, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251152920576, "block_size": 4096, "block_total": 64479564, "block_available": 61316631, "block_used": 3162933, "inode_total": 16384000, "inode_available": 16302322, "inode_used": 81678, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentati<<< 34886 1727204483.10282: stdout chunk (state=3): >>>on": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", 
"rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34886 1727204483.10902: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 34886 1727204483.10953: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path<<< 34886 1727204483.10961: stdout chunk (state=3): >>> # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 34886 1727204483.10996: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external <<< 34886 1727204483.11042: stdout chunk (state=3): >>># cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale <<< 34886 1727204483.11062: stdout chunk (state=3): >>># cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 <<< 34886 1727204483.11108: stdout chunk (state=3): >>># cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing 
ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters <<< 34886 1727204483.11112: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale <<< 34886 1727204483.11154: stdout chunk (state=3): >>># destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool <<< 34886 1727204483.11217: stdout chunk (state=3): >>># cleanup[2] removing 
ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing 
ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux <<< 34886 1727204483.11242: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python <<< 34886 1727204483.11293: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy 
ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 34886 1727204483.11714: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34886 1727204483.11762: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 34886 1727204483.11775: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 34886 1727204483.11846: stdout chunk (state=3): >>># destroy ntpath <<< 34886 1727204483.11850: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal <<< 34886 1727204483.11903: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid <<< 34886 1727204483.12047: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 34886 1727204483.12054: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 34886 1727204483.12131: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 34886 1727204483.12134: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 34886 1727204483.12247: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct # destroy glob 
<<< 34886 1727204483.12256: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 34886 1727204483.12439: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 34886 1727204483.12448: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum <<< 34886 1727204483.12500: stdout chunk (state=3): >>># cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath <<< 34886 1727204483.12549: stdout chunk (state=3): >>># cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34886 1727204483.12746: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 34886 1727204483.12753: stdout chunk (state=3): >>># destroy _collections <<< 34886 1727204483.12834: stdout chunk (state=3): >>># destroy platform # destroy _uuid # 
destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 34886 1727204483.12838: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34886 1727204483.12871: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 34886 1727204483.12875: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 34886 1727204483.12924: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34886 1727204483.13083: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 34886 1727204483.13100: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 34886 1727204483.13161: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34886 1727204483.13767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
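(Annotation, not part of the captured log: the chunks above show the remote Python interpreter, started verbosely as indicated by PYTHONVERBOSE=1 in the reported ansible_env, importing the ansible.module_utils.facts collector tree, emitting the ansible_facts JSON for the setup task, and tearing its module state down before the SSH master connection closes. As a minimal, hypothetical sketch of inspecting the same facts outside a playbook run, one could save a setup result per host with an ad-hoc call such as `ansible managed-node1 -m setup --tree /tmp/facts` and then read a few of the keys visible in the output above; the /tmp/facts path and host name here are illustrative assumptions, not taken from this run.)

    import json
    from pathlib import Path

    # Hypothetical location: `--tree` writes one JSON result file per host
    # into the given directory; the file holds the setup module's result.
    result = json.loads(Path("/tmp/facts/managed-node1").read_text())
    facts = result["ansible_facts"]

    # A few of the fact keys that appear in the chunked stdout above.
    print(facts["ansible_distribution"], facts["ansible_distribution_version"])
    print(facts["ansible_default_ipv4"]["address"])
    print(facts["ansible_virtualization_type"], facts["ansible_virtualization_role"])

(End of annotation; the verbatim log resumes below.)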
<<< 34886 1727204483.14134: stderr chunk (state=3): >>><<< 34886 1727204483.14138: stdout chunk (state=3): >>><<< 34886 1727204483.14174: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cd2c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0ccfbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cd2ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb1d0a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb1dfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb5bdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb5bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb93800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb93e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb73aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb711c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb58f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbb76e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbb6300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb721b0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbb4bf0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbe8710> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb58200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0cbe8bc0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbe8a70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0cbe8e60> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cb56d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbe9520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbe91f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbea420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cc04650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0cc05d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fcf0cc06c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0cc072f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cc061e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0cc07d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cc074a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbea480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c903ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c92c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c92c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c92c740> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c92c920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c901e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c92df70> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c92cbf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0cbeab70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c95a300> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c9724b0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c9af260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c9d1a00> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c9af380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c973140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c7ec3e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c9714f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c92eea0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcf0c7ec680> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_t4mrqe4l/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c8560f0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c82cfe0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c82c140> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c82fe00> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c889ac0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c889850> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c889160> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c8898b0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c856b70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c88a840> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c88aa50> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c88af30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6ecd40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c6ee960> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6ef260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f0440> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f2ed0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c6f2fc0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f10d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f6cc0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f57c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f5520> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f7ef0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f16a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c73ad80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c73aed0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c740ad0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c740890> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c743080> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c7411c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c74a840> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c7431d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c74bb60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c74b530> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c74b9b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c73b200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c74e990> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c74fe00> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c74d130> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c74e4e0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c74cd10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c5d9550> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5d8e90> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c6f5a30> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c753dd0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5db140> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c5e1940> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c5e22d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c753020> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0c5e1040> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5e2480> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c67a810> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5ec440> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5eb920> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5ea480> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c680ec0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bb4bef0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0bb4c260> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5ccfb0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c5cc560> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c682db0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c682840> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0bb4f200> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bb4eab0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0bb4ec90> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bb4df40> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bb4f350> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0bbb9e80> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bb4fe60> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0c682a50> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bbbbbc0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bbbab70> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0bbe6180> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bbd4530> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0ba01c40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bbd7290> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf0ba2b4d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0ba29d60> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0ba2a090> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0ba70710> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0ba71760> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0bac7740> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf0ba735f0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "", "ansible_local": {}, "ansible_loadavg": {"1m": 0.6171875, "5m": 0.56298828125, "15m": 0.37109375}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "22", "epoch": "1727204482", "epoch_int": "1727204482", "date": "2024-09-24", "time": "15:01:22", "iso8601_micro": "2024-09-24T19:01:22.720092Z", "iso8601": "2024-09-24T19:01:22Z", "iso8601_basic": "20240924T150122720092", "iso8601_basic_short": "20240924T150122", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 
1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2848, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 869, "free": 2848}, "nocache": {"free": 3468, "used": 249}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 987, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251152920576, "block_size": 4096, "block_total": 64479564, "block_available": 61316631, "block_used": 3162933, "inode_total": 16384000, "inode_available": 16302322, "inode_used": 81678, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": 
"off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] 
removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] 
removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue 
# cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # 
cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # 
cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy 
warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed.
[WARNING]: Module invocation had junk after the JSON data: (same Python import/cleanup trace as the module output above)
[WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
34886 1727204483.19331: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204483.19378: _low_level_execute_command(): starting 34886 1727204483.19465: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204481.885179-34925-29945639115466/ > /dev/null 2>&1 && sleep 0' 34886 1727204483.20604: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204483.20616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204483.20628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204483.20647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204483.20661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204483.20850: stderr chunk (state=3): >>>debug2: match not found <<< 34886 1727204483.20854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204483.20856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34886 1727204483.20859: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204483.20861: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34886 1727204483.20863: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204483.20866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204483.20868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204483.20871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204483.20873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204483.21091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204483.21155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886
1727204483.23276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204483.23645: stderr chunk (state=3): >>><<< 34886 1727204483.23650: stdout chunk (state=3): >>><<< 34886 1727204483.23652: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204483.23655: handler run complete 34886 1727204483.24059: variable 'ansible_facts' from source: unknown 34886 1727204483.24522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204483.25848: variable 'ansible_facts' from source: unknown 34886 1727204483.26196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204483.26670: attempt loop complete, returning result 34886 1727204483.26807: _execute() done 34886 1727204483.26817: dumping result to json 34886 1727204483.26866: done dumping result, returning 34886 1727204483.26970: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-04b9-2e74-0000000000b9] 34886 1727204483.26983: sending task result for task 12b410aa-8751-04b9-2e74-0000000000b9 34886 1727204483.29274: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000b9 34886 1727204483.29286: WORKER PROCESS EXITING ok: [managed-node3] 34886 1727204483.29724: no more pending results, returning what we have 34886 1727204483.29728: results queue empty 34886 1727204483.29729: checking for any_errors_fatal 34886 1727204483.29731: done checking for any_errors_fatal 34886 1727204483.29732: checking for max_fail_percentage 34886 1727204483.29734: done checking for max_fail_percentage 34886 1727204483.29735: checking to see if all hosts have failed and the running result is not ok 34886 1727204483.29736: done checking to see if all hosts have failed 34886 1727204483.29737: getting the remaining hosts for this loop 34886 1727204483.29739: done getting the remaining hosts for this loop 34886 1727204483.29744: getting the next task for host managed-node3 34886 1727204483.29751: done getting next task for host managed-node3 34886 1727204483.29753: ^ task is: TASK: meta (flush_handlers) 34886 1727204483.29755: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 34886 1727204483.29759: getting variables 34886 1727204483.29761: in VariableManager get_vars() 34886 1727204483.29986: Calling all_inventory to load vars for managed-node3 34886 1727204483.29993: Calling groups_inventory to load vars for managed-node3 34886 1727204483.29997: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204483.30126: Calling all_plugins_play to load vars for managed-node3 34886 1727204483.30131: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204483.30136: Calling groups_plugins_play to load vars for managed-node3 34886 1727204483.30784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204483.31455: done with get_vars() 34886 1727204483.31470: done getting variables 34886 1727204483.31722: in VariableManager get_vars() 34886 1727204483.31735: Calling all_inventory to load vars for managed-node3 34886 1727204483.31738: Calling groups_inventory to load vars for managed-node3 34886 1727204483.31741: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204483.31862: Calling all_plugins_play to load vars for managed-node3 34886 1727204483.31868: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204483.31873: Calling groups_plugins_play to load vars for managed-node3 34886 1727204483.32487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204483.33086: done with get_vars() 34886 1727204483.33117: done queuing things up, now waiting for results queue to drain 34886 1727204483.33122: results queue empty 34886 1727204483.33124: checking for any_errors_fatal 34886 1727204483.33128: done checking for any_errors_fatal 34886 1727204483.33129: checking for max_fail_percentage 34886 1727204483.33130: done checking for max_fail_percentage 34886 1727204483.33131: checking to see if all hosts have failed and the running result is not ok 34886 1727204483.33132: done checking to see if all hosts have failed 34886 1727204483.33133: getting the remaining hosts for this loop 34886 1727204483.33134: done getting the remaining hosts for this loop 34886 1727204483.33138: getting the next task for host managed-node3 34886 1727204483.33144: done getting next task for host managed-node3 34886 1727204483.33147: ^ task is: TASK: Include the task 'el_repo_setup.yml' 34886 1727204483.33149: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204483.33153: getting variables 34886 1727204483.33154: in VariableManager get_vars() 34886 1727204483.33166: Calling all_inventory to load vars for managed-node3 34886 1727204483.33169: Calling groups_inventory to load vars for managed-node3 34886 1727204483.33172: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204483.33178: Calling all_plugins_play to load vars for managed-node3 34886 1727204483.33182: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204483.33185: Calling groups_plugins_play to load vars for managed-node3 34886 1727204483.33457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204483.34183: done with get_vars() 34886 1727204483.34198: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:11 Tuesday 24 September 2024 15:01:23 -0400 (0:00:01.503) 0:00:01.513 ***** 34886 1727204483.34530: entering _queue_task() for managed-node3/include_tasks 34886 1727204483.34533: Creating lock for include_tasks 34886 1727204483.35215: worker is 1 (out of 1 available) 34886 1727204483.35230: exiting _queue_task() for managed-node3/include_tasks 34886 1727204483.35243: done queuing things up, now waiting for results queue to drain 34886 1727204483.35246: waiting for pending results... 34886 1727204483.35608: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 34886 1727204483.35912: in run() - task 12b410aa-8751-04b9-2e74-000000000006 34886 1727204483.35991: variable 'ansible_search_path' from source: unknown 34886 1727204483.36044: calling self._execute() 34886 1727204483.36283: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204483.36368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204483.36386: variable 'omit' from source: magic vars 34886 1727204483.36702: _execute() done 34886 1727204483.36713: dumping result to json 34886 1727204483.36725: done dumping result, returning 34886 1727204483.36745: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [12b410aa-8751-04b9-2e74-000000000006] 34886 1727204483.36897: sending task result for task 12b410aa-8751-04b9-2e74-000000000006 34886 1727204483.37157: no more pending results, returning what we have 34886 1727204483.37164: in VariableManager get_vars() 34886 1727204483.37212: Calling all_inventory to load vars for managed-node3 34886 1727204483.37216: Calling groups_inventory to load vars for managed-node3 34886 1727204483.37224: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204483.37241: Calling all_plugins_play to load vars for managed-node3 34886 1727204483.37245: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204483.37250: Calling groups_plugins_play to load vars for managed-node3 34886 1727204483.37826: done sending task result for task 12b410aa-8751-04b9-2e74-000000000006 34886 1727204483.37831: WORKER PROCESS EXITING 34886 1727204483.37894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204483.38370: done with get_vars() 34886 1727204483.38380: variable 'ansible_search_path' from source: unknown 34886 1727204483.38398: we have included files to process 34886 1727204483.38400: generating 
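
For orientation, the task queued above is a plain include: tests_ipv6_nm.yml line 11 pulls el_repo_setup.yml into the play via include_tasks. A minimal sketch of what such an entry typically looks like (assumed shape, not a quote of the actual test playbook):

  # Sketch of the kind of task at tests_ipv6_nm.yml:11; the relative path is assumed.
  - name: Include the task 'el_repo_setup.yml'
    include_tasks: tasks/el_repo_setup.yml
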
all_blocks data 34886 1727204483.38401: done generating all_blocks data 34886 1727204483.38402: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34886 1727204483.38407: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34886 1727204483.38412: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34886 1727204483.40116: in VariableManager get_vars() 34886 1727204483.40139: done with get_vars() 34886 1727204483.40156: done processing included file 34886 1727204483.40159: iterating over new_blocks loaded from include file 34886 1727204483.40161: in VariableManager get_vars() 34886 1727204483.40173: done with get_vars() 34886 1727204483.40175: filtering new block on tags 34886 1727204483.40196: done filtering new block on tags 34886 1727204483.40200: in VariableManager get_vars() 34886 1727204483.40212: done with get_vars() 34886 1727204483.40214: filtering new block on tags 34886 1727204483.40236: done filtering new block on tags 34886 1727204483.40239: in VariableManager get_vars() 34886 1727204483.40252: done with get_vars() 34886 1727204483.40254: filtering new block on tags 34886 1727204483.40271: done filtering new block on tags 34886 1727204483.40274: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 34886 1727204483.40281: extending task lists for all hosts with included blocks 34886 1727204483.40551: done extending task lists 34886 1727204483.40553: done processing included files 34886 1727204483.40554: results queue empty 34886 1727204483.40555: checking for any_errors_fatal 34886 1727204483.40557: done checking for any_errors_fatal 34886 1727204483.40558: checking for max_fail_percentage 34886 1727204483.40559: done checking for max_fail_percentage 34886 1727204483.40560: checking to see if all hosts have failed and the running result is not ok 34886 1727204483.40561: done checking to see if all hosts have failed 34886 1727204483.40562: getting the remaining hosts for this loop 34886 1727204483.40563: done getting the remaining hosts for this loop 34886 1727204483.40566: getting the next task for host managed-node3 34886 1727204483.40571: done getting next task for host managed-node3 34886 1727204483.40574: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 34886 1727204483.40577: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204483.40579: getting variables 34886 1727204483.40581: in VariableManager get_vars() 34886 1727204483.40593: Calling all_inventory to load vars for managed-node3 34886 1727204483.40596: Calling groups_inventory to load vars for managed-node3 34886 1727204483.40599: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204483.40606: Calling all_plugins_play to load vars for managed-node3 34886 1727204483.40609: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204483.40613: Calling groups_plugins_play to load vars for managed-node3 34886 1727204483.41059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204483.41778: done with get_vars() 34886 1727204483.41791: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:01:23 -0400 (0:00:00.073) 0:00:01.586 ***** 34886 1727204483.41875: entering _queue_task() for managed-node3/setup 34886 1727204483.42641: worker is 1 (out of 1 available) 34886 1727204483.42654: exiting _queue_task() for managed-node3/setup 34886 1727204483.42668: done queuing things up, now waiting for results queue to drain 34886 1727204483.42670: waiting for pending results... 34886 1727204483.43216: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 34886 1727204483.43737: in run() - task 12b410aa-8751-04b9-2e74-0000000000ca 34886 1727204483.43745: variable 'ansible_search_path' from source: unknown 34886 1727204483.43749: variable 'ansible_search_path' from source: unknown 34886 1727204483.43751: calling self._execute() 34886 1727204483.43855: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204483.44063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204483.44067: variable 'omit' from source: magic vars 34886 1727204483.45307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204483.50918: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204483.51122: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204483.51172: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204483.51356: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204483.51428: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204483.51627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204483.51675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204483.51787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 34886 1727204483.51849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204483.52080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204483.52436: variable 'ansible_facts' from source: unknown 34886 1727204483.52657: variable 'network_test_required_facts' from source: task vars 34886 1727204483.52710: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 34886 1727204483.52805: variable 'omit' from source: magic vars 34886 1727204483.52974: variable 'omit' from source: magic vars 34886 1727204483.53033: variable 'omit' from source: magic vars 34886 1727204483.53078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204483.53279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204483.53282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204483.53285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204483.53287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204483.53415: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204483.53426: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204483.53436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204483.53682: Set connection var ansible_timeout to 10 34886 1727204483.53726: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204483.53733: Set connection var ansible_connection to ssh 34886 1727204483.53747: Set connection var ansible_shell_executable to /bin/sh 34886 1727204483.53807: Set connection var ansible_pipelining to False 34886 1727204483.53821: Set connection var ansible_shell_type to sh 34886 1727204483.53859: variable 'ansible_shell_executable' from source: unknown 34886 1727204483.53903: variable 'ansible_connection' from source: unknown 34886 1727204483.53912: variable 'ansible_module_compression' from source: unknown 34886 1727204483.53940: variable 'ansible_shell_type' from source: unknown 34886 1727204483.54152: variable 'ansible_shell_executable' from source: unknown 34886 1727204483.54155: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204483.54159: variable 'ansible_pipelining' from source: unknown 34886 1727204483.54162: variable 'ansible_timeout' from source: unknown 34886 1727204483.54165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204483.54406: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34886 1727204483.54587: variable 'omit' from source: magic vars 34886 1727204483.54593: starting attempt loop 34886 
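
The conditional evaluated above, not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts, makes the setup task run only while some required fact is still missing from the fact cache. A rough sketch of a task with that shape (the gather_subset value and the fact list are invented for illustration; the real el_repo_setup.yml is not reproduced here):

  # Sketch only; the values of gather_subset and network_test_required_facts are assumptions.
  - name: Gather the minimum subset of ansible_facts required by the network role test
    setup:
      gather_subset: min
    vars:
      network_test_required_facts:
        - distribution
        - distribution_major_version
    when: not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts
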
1727204483.54595: running the handler 34886 1727204483.54599: _low_level_execute_command(): starting 34886 1727204483.54601: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204483.56354: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204483.56575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204483.56583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204483.56629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204483.58551: stdout chunk (state=3): >>>/root <<< 34886 1727204483.58661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204483.58732: stderr chunk (state=3): >>><<< 34886 1727204483.58851: stdout chunk (state=3): >>><<< 34886 1727204483.58856: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204483.58867: _low_level_execute_command(): starting 34886 1727204483.58903: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869 `" && echo ansible-tmp-1727204483.58831-35204-268958697768869="` echo /root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869 `" ) && sleep 0' 34886 1727204483.60239: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration 
data /root/.ssh/config <<< 34886 1727204483.60255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204483.60308: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204483.60546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204483.60554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204483.60558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204483.60722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204483.62598: stdout chunk (state=3): >>>ansible-tmp-1727204483.58831-35204-268958697768869=/root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869 <<< 34886 1727204483.62709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204483.62995: stderr chunk (state=3): >>><<< 34886 1727204483.62998: stdout chunk (state=3): >>><<< 34886 1727204483.63001: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204483.58831-35204-268958697768869=/root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204483.63004: variable 'ansible_module_compression' from source: unknown 34886 1727204483.63006: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34886 1727204483.63196: variable 'ansible_facts' from source: unknown 34886 1727204483.63778: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869/AnsiballZ_setup.py 34886 1727204483.64134: Sending 
initial data 34886 1727204483.64145: Sent initial data (152 bytes) 34886 1727204483.65513: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204483.65647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204483.65729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204483.65770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204483.67694: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869/AnsiballZ_setup.py" <<< 34886 1727204483.67699: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp0jwz3u52 /root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869/AnsiballZ_setup.py <<< 34886 1727204483.67739: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp0jwz3u52" to remote "/root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869/AnsiballZ_setup.py" <<< 34886 1727204483.73038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204483.73324: stderr chunk (state=3): >>><<< 34886 1727204483.73328: stdout chunk (state=3): >>><<< 34886 1727204483.73330: done transferring module to remote 34886 1727204483.73334: _low_level_execute_command(): starting 34886 1727204483.73347: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869/ /root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869/AnsiballZ_setup.py && sleep 0' 34886 1727204483.74560: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204483.74564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204483.74567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204483.74573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204483.74576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204483.74849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204483.74911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204483.76874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204483.76878: stdout chunk (state=3): >>><<< 34886 1727204483.76881: stderr chunk (state=3): >>><<< 34886 1727204483.77195: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204483.77199: _low_level_execute_command(): starting 34886 1727204483.77201: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869/AnsiballZ_setup.py && sleep 0' 34886 1727204483.78629: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204483.78809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204483.78824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204483.78881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204483.81103: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34886 1727204483.81210: stdout chunk (state=3): >>>import _imp # builtin <<< 34886 1727204483.81213: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 34886 1727204483.81321: stdout chunk (state=3): >>>import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 34886 1727204483.81466: stdout chunk (state=3): >>>import 'time' # <<< 34886 1727204483.81470: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 34886 1727204483.81506: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34886 1727204483.81524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 34886 1727204483.81536: stdout chunk (state=3): >>>import 'encodings.aliases' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd00a0d44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd00a0a3ad0> <<< 34886 1727204483.81576: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 34886 1727204483.81711: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd00a0d6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 34886 1727204483.81807: stdout chunk (state=3): >>>import '_collections_abc' # <<< 34886 1727204483.81831: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 34886 1727204483.81856: stdout chunk (state=3): >>>import 'os' # <<< 34886 1727204483.81874: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 34886 1727204483.81929: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 34886 1727204483.81933: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 34886 1727204483.82040: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009e850a0> <<< 34886 1727204483.82312: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009e85fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 34886 1727204483.82544: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34886 1727204483.82559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 34886 1727204483.82595: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204483.82757: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ec3e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 34886 1727204483.82777: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ec3f50> <<< 34886 1727204483.82805: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34886 1727204483.82829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34886 1727204483.83002: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34886 1727204483.83006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009efb860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 34886 1727204483.83009: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009efbef0> <<< 34886 1727204483.83022: stdout chunk (state=3): >>>import '_collections' # <<< 34886 1727204483.83066: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009edbb60> <<< 34886 1727204483.83085: stdout chunk (state=3): >>>import '_functools' # <<< 34886 1727204483.83110: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ed9280> <<< 34886 1727204483.83212: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ec1040> <<< 34886 1727204483.83236: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34886 1727204483.83257: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 34886 1727204483.83272: stdout chunk 
(state=3): >>>import '_sre' # <<< 34886 1727204483.83315: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 34886 1727204483.83414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f1f740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f1e360> <<< 34886 1727204483.83435: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009eda270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ec2f30> <<< 34886 1727204483.83495: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 34886 1727204483.83537: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f50740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ec02c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34886 1727204483.83642: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009f50bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f50aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009f50e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ebede0> <<< 34886 1727204483.83664: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204483.83684: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34886 1727204483.83751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f51520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f511f0> import 'importlib.machinery' # <<< 34886 1727204483.83779: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 34886 1727204483.83802: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f52420> <<< 34886 1727204483.83861: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34886 1727204483.83885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34886 1727204483.83970: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f6c650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204483.84114: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009f6dd60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f6ec60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009f6f2c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f6e1b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 34886 1727204483.84204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009f6fd40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f6f470> <<< 34886 1727204483.84237: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f52480> <<< 34886 1727204483.84316: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34886 1727204483.84331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 34886 1727204483.84362: stdout chunk (state=3): >>># 
extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ca7cb0> <<< 34886 1727204483.84388: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34886 1727204483.84612: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009cd07a0> <<< 34886 1727204483.84616: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009cd0500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009cd0620> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009cd0980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ca5e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34886 1727204483.84645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34886 1727204483.84650: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 34886 1727204483.84675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 34886 1727204483.84678: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009cd2090> <<< 34886 1727204483.84696: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009cd0d10> <<< 34886 1727204483.84733: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f52b70> <<< 34886 1727204483.84844: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34886 1727204483.84860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34886 1727204483.84894: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009cfa450> <<< 34886 1727204483.84947: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34886 1727204483.84962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204483.85011: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34886 1727204483.85180: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009d165a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34886 1727204483.85186: stdout chunk (state=3): >>>import 'ntpath' # <<< 34886 1727204483.85204: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009d4f2f0> <<< 34886 1727204483.85312: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34886 1727204483.85326: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34886 1727204483.85421: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009d75a90> <<< 34886 1727204483.85497: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009d4f410> <<< 34886 1727204483.85608: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009d17230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009b503b0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009d155e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009cd2ff0> <<< 34886 1727204483.85765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34886 1727204483.85783: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd009d15700> <<< 34886 1727204483.85957: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_amhdt7n8/ansible_setup_payload.zip' <<< 34886 1727204483.86042: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.86147: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 34886 1727204483.86162: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34886 
1727204483.86259: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34886 1727204483.86275: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34886 1727204483.86312: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009bbe030> <<< 34886 1727204483.86326: stdout chunk (state=3): >>>import '_typing' # <<< 34886 1727204483.86525: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009b94f20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009b53fb0> <<< 34886 1727204483.86621: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 34886 1727204483.86640: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.88222: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.89748: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009b97ec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009bedaf0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009bed880> <<< 34886 1727204483.89752: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009bed190> <<< 34886 1727204483.89755: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 34886 1727204483.89758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34886 1727204483.89793: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009bed5e0> <<< 34886 1727204483.89907: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009bbecc0> <<< 34886 1727204483.89924: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009bee8a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009beeae0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34886 1727204483.89950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 34886 1727204483.90063: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009beeff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34886 1727204483.90172: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a54dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009a569f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34886 1727204483.90211: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a57350> <<< 34886 1727204483.90281: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a58530> <<< 34886 1727204483.90302: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34886 1727204483.90337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34886 1727204483.90363: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34886 1727204483.90422: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a5b020> <<< 34886 1727204483.90515: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009a5b140> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a592e0> <<< 34886 1727204483.90519: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34886 1727204483.90619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34886 1727204483.90622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34886 1727204483.90637: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 34886 1727204483.90714: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a5f020> import '_tokenize' # <<< 34886 1727204483.90741: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a5daf0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a5d850> <<< 34886 1727204483.90828: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34886 1727204483.90853: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a5fc20> <<< 34886 1727204483.90884: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a597f0> <<< 34886 1727204483.90954: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009aa31d0> <<< 34886 1727204483.90958: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009aa3320> <<< 34886 1727204483.91046: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34886 1727204483.91111: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009aacf20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fd009aaccb0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34886 1727204483.91203: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34886 1727204483.91287: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009aaf470> <<< 34886 1727204483.91293: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009aad5e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34886 1727204483.91381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 34886 1727204483.91485: stdout chunk (state=3): >>>import '_string' # <<< 34886 1727204483.91488: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ab2bd0> <<< 34886 1727204483.91598: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009aaf560> <<< 34886 1727204483.91671: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ab3920> <<< 34886 1727204483.91922: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ab3c80> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ab2f90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009aa3650> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import 
'_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ab6b40> <<< 34886 1727204483.92127: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204483.92131: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ab80b0> <<< 34886 1727204483.92150: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ab52e0> <<< 34886 1727204483.92174: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204483.92188: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ab6690> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ab4ec0> <<< 34886 1727204483.92260: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 34886 1727204483.92341: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.92445: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.92475: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 34886 1727204483.92611: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 34886 1727204483.92670: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.92815: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.93499: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.94192: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 34886 1727204483.94264: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204483.94307: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009940320> <<< 34886 1727204483.94430: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 34886 1727204483.94434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34886 1727204483.94450: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009941160> <<< 34886 1727204483.94530: stdout chunk (state=3): >>>import 
'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009abab70> <<< 34886 1727204483.94534: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 34886 1727204483.94536: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.94612: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 34886 1727204483.94760: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.95049: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 34886 1727204483.95066: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009941310> # zipimport: zlib available <<< 34886 1727204483.95544: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.96206: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34886 1727204483.96273: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34886 1727204483.96287: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.96334: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.96449: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 34886 1727204483.96512: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34886 1727204483.96578: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34886 1727204483.96672: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 34886 1727204483.96686: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.96723: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 34886 1727204483.96741: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.97019: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.97320: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34886 1727204483.97381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34886 1727204483.97395: stdout chunk (state=3): >>>import '_ast' # <<< 34886 1727204483.97494: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009943590> # zipimport: zlib available <<< 34886 1727204483.97671: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.97675: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 34886 1727204483.97682: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 34886 1727204483.97746: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 34886 1727204483.97760: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34886 1727204483.97804: stdout chunk (state=3): >>># extension module '_hashlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204483.97927: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009949cd0> <<< 34886 1727204483.98085: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd00994a630> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ab5250> # zipimport: zlib available # zipimport: zlib available <<< 34886 1727204483.98109: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34886 1727204483.98122: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.98167: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.98214: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.98310: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.98352: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34886 1727204483.98423: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204483.98714: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009949430> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd00994a8a0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34886 1727204483.98744: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.98818: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204483.98821: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34886 1727204483.98844: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34886 1727204483.99094: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34886 1727204483.99114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from 
'/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099dea50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099546e0> <<< 34886 1727204483.99171: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099528a0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009952660> # destroy ansible.module_utils.distro <<< 34886 1727204483.99175: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 34886 1727204483.99177: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.99215: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.99236: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 34886 1727204483.99336: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 34886 1727204483.99350: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.99416: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.99485: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.99506: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.99532: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.99577: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.99670: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34886 1727204483.99706: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 34886 1727204483.99771: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.99799: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.99881: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204483.99991: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 34886 1727204484.00204: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.00346: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.00395: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.00454: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 34886 1727204484.00547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 34886 1727204484.00555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 34886 1727204484.00642: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099e1310> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 34886 1727204484.00745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f0c1d0> <<< 34886 1727204484.00767: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204484.00870: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008f0c5c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099b9250> <<< 34886 1727204484.00874: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099b8800> <<< 34886 1727204484.00963: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099e36e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099e2db0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 34886 1727204484.01000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 34886 1727204484.01019: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 34886 1727204484.01033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 34886 1727204484.01054: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 34886 1727204484.01102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008f0f500> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f0edb0> <<< 34886 1727204484.01318: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008f0ef90> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f0e1e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f0f6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 34886 1727204484.01353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 34886 1727204484.01381: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008f7a210> <<< 34886 1727204484.01527: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f78230> <<< 34886 1727204484.01531: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099e2f90> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 34886 1727204484.01534: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.01583: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.01644: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 34886 1727204484.01658: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.01727: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.01773: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 34886 1727204484.01790: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.01835: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 34886 1727204484.01863: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.01904: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 34886 1727204484.02042: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.02070: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 34886 1727204484.02077: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.02112: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 34886 1727204484.02159: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.02186: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.02248: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.02310: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.02376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 34886 1727204484.02476: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.02947: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 
1727204484.03445: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 34886 1727204484.03448: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.03500: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.03559: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.03656: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 34886 1727204484.03681: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.03713: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 34886 1727204484.03764: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.03788: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.03842: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 34886 1727204484.03984: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.04112: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available <<< 34886 1727204484.04201: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 34886 1727204484.04215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 34886 1727204484.04228: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f7a540> <<< 34886 1727204484.04248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 34886 1727204484.04420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f7b1a0> import 'ansible.module_utils.facts.system.local' # <<< 34886 1727204484.04429: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.04500: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.04588: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 34886 1727204484.04593: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.04679: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.04778: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 34886 1727204484.04794: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.04857: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.05061: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available <<< 34886 1727204484.05066: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 34886 1727204484.05093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 34886 1727204484.05170: stdout chunk (state=3): >>># extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204484.05238: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008faa630><<< 34886 1727204484.05278: stdout chunk (state=3): >>> <<< 34886 1727204484.05453: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f92480> <<< 34886 1727204484.05471: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 34886 1727204484.05533: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.05607: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 34886 1727204484.05721: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34886 1727204484.05790: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.05932: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.06078: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 34886 1727204484.06161: stdout chunk (state=3): >>># zipimport: zlib available<<< 34886 1727204484.06166: stdout chunk (state=3): >>> <<< 34886 1727204484.06169: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.06313: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 34886 1727204484.06345: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204484.06357: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008d61f10> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f935f0> import 'ansible.module_utils.facts.system.user' # <<< 34886 1727204484.06393: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.06489: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 34886 1727204484.06492: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.06497: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.06500: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 34886 1727204484.06511: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.06679: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.06846: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 34886 1727204484.06861: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.06970: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.07108: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.07165: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 
1727204484.07172: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 34886 1727204484.07192: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.07214: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.07238: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.07486: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.07557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 34886 1727204484.07609: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 34886 1727204484.07713: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.07894: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 34886 1727204484.08003: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.08019: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34886 1727204484.08556: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.09133: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 34886 1727204484.09151: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 34886 1727204484.09263: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.09382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 34886 1727204484.09467: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.09500: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.09615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 34886 1727204484.09625: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.09791: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.09965: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 34886 1727204484.10015: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.10019: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 34886 1727204484.10027: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.10057: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.10105: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 34886 1727204484.10137: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.10225: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.10354: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.10561: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.10786: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 34886 1727204484.11004: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.11026: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 34886 1727204484.11038: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.11114: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 34886 1727204484.11125: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.11147: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.11173: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 34886 1727204484.11188: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.11249: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.11314: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 34886 1727204484.11326: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.11462: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.11465: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 34886 1727204484.11468: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.11756: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.12060: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 34886 1727204484.12186: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.12205: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 34886 1727204484.12238: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.12279: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 34886 1727204484.12509: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 34886 1727204484.12536: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.12794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 34886 1727204484.12797: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.12800: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 34886 1727204484.12803: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.12805: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.12807: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 34886 1727204484.12811: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.13012: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34886 1727204484.13093: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 34886 1727204484.13109: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 34886 1727204484.13124: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.13172: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.13225: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 34886 1727204484.13237: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 34886 1727204484.13456: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.13690: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 34886 1727204484.13695: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.13799: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.13803: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 34886 1727204484.13909: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 34886 1727204484.14000: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.14092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 34886 1727204484.14109: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.14206: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.14308: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 34886 1727204484.14406: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.15053: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 34886 1727204484.15065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 34886 1727204484.15092: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 34886 1727204484.15112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 34886 1727204484.15146: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204484.15166: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008d8be00> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008d88aa0> <<< 34886 1727204484.15308: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008d89f10> <<< 34886 1727204484.15917: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": 
{"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_syste<<< 34886 1727204484.15930: stdout chunk (state=3): >>>m": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": 
"managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "24", "epoch": "1727204484", "epoch_int": "1727204484", "date": "2024-09-24", "time": "15:01:24", "iso8601_micro": "2024-09-24T19:01:24.157077Z", "iso8601": "2024-09-24T19:01:24Z", "iso8601_basic": "20240924T150124157077", "iso8601_basic_short": "20240924T150124", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34886 1727204484.16525: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 34886 1727204484.16536: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ <<< 34886 1727204484.16540: stdout chunk (state=3): >>># clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings <<< 34886 1727204484.16594: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 34886 1727204484.16598: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing 
re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma <<< 34886 1727204484.16623: stdout chunk (state=3): >>># cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils <<< 34886 1727204484.16635: stdout chunk (state=3): >>># cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime <<< 34886 1727204484.16852: stdout chunk (state=3): >>># cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing 
ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing 
_multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing 
ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 34886 1727204484.17177: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34886 1727204484.17184: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 34886 1727204484.17189: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 34886 1727204484.17394: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 34886 1727204484.17398: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 34886 1727204484.17401: stdout chunk (state=3): >>># destroy ntpath <<< 34886 1727204484.17403: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 34886 1727204484.17405: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 34886 1727204484.17412: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select <<< 34886 1727204484.17422: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 34886 1727204484.17443: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors<<< 34886 1727204484.17470: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 34886 1727204484.17514: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata <<< 34886 1727204484.17801: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 34886 1727204484.17809: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy 
subprocess # destroy base64 <<< 34886 1727204484.17838: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 34886 1727204484.17869: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time <<< 34886 1727204484.17895: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 34886 1727204484.17910: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34886 1727204484.18062: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 34886 1727204484.18074: stdout chunk (state=3): >>># destroy _collections <<< 34886 1727204484.18103: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 34886 1727204484.18117: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # 
destroy tokenize <<< 34886 1727204484.18139: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34886 1727204484.18174: stdout chunk (state=3): >>># destroy _typing <<< 34886 1727204484.18194: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 34886 1727204484.18206: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 34886 1727204484.18225: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34886 1727204484.18316: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 34886 1727204484.18335: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 34886 1727204484.18355: stdout chunk (state=3): >>># destroy _random <<< 34886 1727204484.18603: stdout chunk (state=3): >>># destroy _weakref <<< 34886 1727204484.18634: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34886 1727204484.18976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204484.18995: stdout chunk (state=3): >>><<< 34886 1727204484.18998: stderr chunk (state=3): >>><<< 34886 1727204484.19199: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd00a0d44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd00a0a3ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd00a0d6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009e850a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009e85fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ec3e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ec3f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009efb860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009efbef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009edbb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ed9280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ec1040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f1f740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f1e360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009eda270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ec2f30> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f50740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ec02c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009f50bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f50aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009f50e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ebede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f51520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f511f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f52420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f6c650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009f6dd60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fd009f6ec60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009f6f2c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f6e1b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009f6fd40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f6f470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f52480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ca7cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009cd07a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009cd0500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009cd0620> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009cd0980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ca5e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009cd2090> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009cd0d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009f52b70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009cfa450> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009d165a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009d4f2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009d75a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009d4f410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009d17230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009b503b0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009d155e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009cd2ff0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd009d15700> # zipimport: found 103 names in '/tmp/ansible_setup_payload_amhdt7n8/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009bbe030> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009b94f20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009b53fb0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009b97ec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009bedaf0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009bed880> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009bed190> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009bed5e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009bbecc0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009bee8a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009beeae0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009beeff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a54dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009a569f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a57350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a58530> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a5b020> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009a5b140> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a592e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a5f020> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a5daf0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a5d850> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a5fc20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009a597f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009aa31d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009aa3320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009aacf20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009aaccb0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009aaf470> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009aad5e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ab2bd0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009aaf560> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ab3920> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ab3c80> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ab2f90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009aa3650> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ab6b40> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ab80b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ab52e0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009ab6690> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ab4ec0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009940320> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009941160> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009abab70> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009941310> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009943590> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009949cd0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd00994a630> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009ab5250> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd009949430> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd00994a8a0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099dea50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099546e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099528a0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd009952660> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099e1310> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd008f0c1d0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008f0c5c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099b9250> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099b8800> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099e36e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099e2db0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008f0f500> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f0edb0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008f0ef90> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f0e1e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f0f6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008f7a210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f78230> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0099e2f90> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f7a540> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f7b1a0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008faa630> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f92480> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008d61f10> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008f935f0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd008d8be00> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008d88aa0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd008d89f10> {"ansible_facts": {"ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, 
"type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", 
"ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "24", "epoch": "1727204484", "epoch_int": "1727204484", "date": "2024-09-24", "time": "15:01:24", "iso8601_micro": "2024-09-24T19:01:24.157077Z", "iso8601": "2024-09-24T19:01:24Z", "iso8601_basic": "20240924T150124157077", "iso8601_basic_short": "20240924T150124", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # 
cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing 
ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing 
ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy 
ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping 
threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. [WARNING]: Module invocation had junk after the JSON data:
34886 1727204484.21102: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204484.21106: _low_level_execute_command(): starting 34886 1727204484.21108: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204483.58831-35204-268958697768869/ > /dev/null 2>&1 && sleep 0' 34886 1727204484.21111:
stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204484.21494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204484.21497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204484.21500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204484.21502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204484.21505: stderr chunk (state=3): >>>debug2: match not found <<< 34886 1727204484.21507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204484.21509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34886 1727204484.21511: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204484.21513: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34886 1727204484.21515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204484.21517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204484.21519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204484.21521: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204484.21523: stderr chunk (state=3): >>>debug2: match found <<< 34886 1727204484.21526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204484.21534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204484.21555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204484.21569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204484.21792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204484.23574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204484.23660: stderr chunk (state=3): >>><<< 34886 1727204484.23715: stdout chunk (state=3): >>><<< 34886 1727204484.23826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 34886 1727204484.23832: handler run complete 34886 1727204484.24003: variable 'ansible_facts' from source: unknown 34886 1727204484.24071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204484.24452: variable 'ansible_facts' from source: unknown 34886 1727204484.24587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204484.24612: attempt loop complete, returning result 34886 1727204484.24616: _execute() done 34886 1727204484.24621: dumping result to json 34886 1727204484.24636: done dumping result, returning 34886 1727204484.24646: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [12b410aa-8751-04b9-2e74-0000000000ca] 34886 1727204484.24654: sending task result for task 12b410aa-8751-04b9-2e74-0000000000ca ok: [managed-node3] 34886 1727204484.25314: no more pending results, returning what we have 34886 1727204484.25318: results queue empty 34886 1727204484.25321: checking for any_errors_fatal 34886 1727204484.25323: done checking for any_errors_fatal 34886 1727204484.25324: checking for max_fail_percentage 34886 1727204484.25325: done checking for max_fail_percentage 34886 1727204484.25326: checking to see if all hosts have failed and the running result is not ok 34886 1727204484.25327: done checking to see if all hosts have failed 34886 1727204484.25328: getting the remaining hosts for this loop 34886 1727204484.25329: done getting the remaining hosts for this loop 34886 1727204484.25334: getting the next task for host managed-node3 34886 1727204484.25343: done getting next task for host managed-node3 34886 1727204484.25346: ^ task is: TASK: Check if system is ostree 34886 1727204484.25349: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
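
The "ok: [managed-node3]" result above is the return of the setup module invoked with gather_subset 'min' (visible in the _execute_module arguments earlier in this trace) for the task "Gather the minimum subset of ansible_facts required by the network role test". A minimal sketch of what such a task likely looks like in the test playbook, assuming standard ansible.builtin.setup syntax (the actual task file is not shown in this log):

    # sketch only -- reconstructed from the module name and arguments in the trace above
    - name: Gather the minimum subset of ansible_facts required by the network role test
      ansible.builtin.setup:
        gather_subset: min
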
False 34886 1727204484.25353: getting variables 34886 1727204484.25355: in VariableManager get_vars() 34886 1727204484.25385: Calling all_inventory to load vars for managed-node3 34886 1727204484.25590: Calling groups_inventory to load vars for managed-node3 34886 1727204484.25597: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204484.25614: Calling all_plugins_play to load vars for managed-node3 34886 1727204484.25621: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204484.25626: Calling groups_plugins_play to load vars for managed-node3 34886 1727204484.26274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204484.26918: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000ca 34886 1727204484.26925: WORKER PROCESS EXITING 34886 1727204484.26934: done with get_vars() 34886 1727204484.26948: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:01:24 -0400 (0:00:00.852) 0:00:02.439 ***** 34886 1727204484.27148: entering _queue_task() for managed-node3/stat 34886 1727204484.27832: worker is 1 (out of 1 available) 34886 1727204484.27845: exiting _queue_task() for managed-node3/stat 34886 1727204484.27860: done queuing things up, now waiting for results queue to drain 34886 1727204484.27862: waiting for pending results... 34886 1727204484.28227: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 34886 1727204484.28353: in run() - task 12b410aa-8751-04b9-2e74-0000000000cc 34886 1727204484.28513: variable 'ansible_search_path' from source: unknown 34886 1727204484.28522: variable 'ansible_search_path' from source: unknown 34886 1727204484.28567: calling self._execute() 34886 1727204484.28994: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204484.29000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204484.29003: variable 'omit' from source: magic vars 34886 1727204484.29845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204484.30378: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204484.30647: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204484.30695: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204484.30761: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204484.31093: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204484.31129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204484.31167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204484.31205: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204484.31547: Evaluated conditional (not __network_is_ostree is defined): True 34886 1727204484.31562: variable 'omit' from source: magic vars 34886 1727204484.31618: variable 'omit' from source: magic vars 34886 1727204484.31664: variable 'omit' from source: magic vars 34886 1727204484.32094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204484.32098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204484.32101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204484.32103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204484.32105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204484.32107: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204484.32110: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204484.32112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204484.32199: Set connection var ansible_timeout to 10 34886 1727204484.32406: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204484.32415: Set connection var ansible_connection to ssh 34886 1727204484.32429: Set connection var ansible_shell_executable to /bin/sh 34886 1727204484.32445: Set connection var ansible_pipelining to False 34886 1727204484.32453: Set connection var ansible_shell_type to sh 34886 1727204484.32488: variable 'ansible_shell_executable' from source: unknown 34886 1727204484.32502: variable 'ansible_connection' from source: unknown 34886 1727204484.32511: variable 'ansible_module_compression' from source: unknown 34886 1727204484.32520: variable 'ansible_shell_type' from source: unknown 34886 1727204484.32528: variable 'ansible_shell_executable' from source: unknown 34886 1727204484.32536: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204484.32545: variable 'ansible_pipelining' from source: unknown 34886 1727204484.32554: variable 'ansible_timeout' from source: unknown 34886 1727204484.32564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204484.32961: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34886 1727204484.32978: variable 'omit' from source: magic vars 34886 1727204484.32988: starting attempt loop 34886 1727204484.32999: running the handler 34886 1727204484.33020: _low_level_execute_command(): starting 34886 1727204484.33034: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204484.34256: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204484.34400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
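
The TASK [Check if system is ostree] banner above points at el_repo_setup.yml:17 and is executed through the stat module, guarded by the conditional "not __network_is_ostree is defined" that the trace just evaluated as True. A hedged sketch of such a task; the checked path and the register name are assumptions (the actual module arguments are not visible at this point in the log), with /run/ostree-booted being the path commonly probed for this purpose:

    # sketch only -- path and register name are assumed, not taken from this trace
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # assumed path; the real argument is not shown here
      register: __ostree_booted_stat    # hypothetical variable name
      when: not __network_is_ostree is defined
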
all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204484.34403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204484.34406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204484.34491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204484.34604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204484.34665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204484.36518: stdout chunk (state=3): >>>/root <<< 34886 1727204484.36582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204484.36871: stderr chunk (state=3): >>><<< 34886 1727204484.36874: stdout chunk (state=3): >>><<< 34886 1727204484.36880: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204484.36893: _low_level_execute_command(): starting 34886 1727204484.36896: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383 `" && echo ansible-tmp-1727204484.3684309-35292-200700888934383="` echo /root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383 `" ) && sleep 0' 34886 1727204484.38290: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204484.38296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204484.38299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204484.38307: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204484.38318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204484.38364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204484.38494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204484.38627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204484.40692: stdout chunk (state=3): >>>ansible-tmp-1727204484.3684309-35292-200700888934383=/root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383 <<< 34886 1727204484.40801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204484.40991: stderr chunk (state=3): >>><<< 34886 1727204484.41013: stdout chunk (state=3): >>><<< 34886 1727204484.41297: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204484.3684309-35292-200700888934383=/root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204484.41301: variable 'ansible_module_compression' from source: unknown 34886 1727204484.41304: ANSIBALLZ: Using lock for stat 34886 1727204484.41306: ANSIBALLZ: Acquiring lock 34886 1727204484.41309: ANSIBALLZ: Lock acquired: 139734986903616 34886 1727204484.41311: ANSIBALLZ: Creating module 34886 1727204484.70685: ANSIBALLZ: Writing module into payload 34886 1727204484.70823: ANSIBALLZ: Writing module 34886 1727204484.70859: ANSIBALLZ: Renaming module 34886 1727204484.70872: ANSIBALLZ: Done creating module 34886 1727204484.70901: variable 'ansible_facts' from source: unknown 34886 1727204484.71001: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383/AnsiballZ_stat.py 34886 1727204484.71187: Sending initial data 34886 
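
The mkdir command above stages a per-task directory under ~/.ansible/tmp because the module arguments earlier in the trace set '_ansible_remote_tmp' to '~/.ansible/tmp'. If the login user's home directory is not writable on a managed host, that location can be overridden per host or group with the standard ansible_remote_tmp variable; a sketch, where the file name and path are illustrative only:

    # host_vars/managed-node3.yml (illustrative) -- relocate where AnsiballZ payloads are staged
    ansible_remote_tmp: /var/tmp/.ansible/tmp
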
1727204484.71202: Sent initial data (153 bytes) 34886 1727204484.71869: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204484.71945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204484.72007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204484.72038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204484.72058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204484.72178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204484.73929: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 34886 1727204484.73946: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 34886 1727204484.73958: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 34886 1727204484.73971: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 34886 1727204484.74011: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204484.74124: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204484.74163: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpoltmwe7x /root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383/AnsiballZ_stat.py <<< 34886 1727204484.74167: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383/AnsiballZ_stat.py" <<< 34886 1727204484.74440: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpoltmwe7x" to remote "/root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383/AnsiballZ_stat.py" <<< 34886 1727204484.76222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204484.76329: stderr chunk (state=3): >>><<< 34886 1727204484.76340: stdout chunk (state=3): >>><<< 34886 1727204484.76376: done transferring module to remote 34886 1727204484.76416: _low_level_execute_command(): starting 34886 1727204484.76686: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383/ /root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383/AnsiballZ_stat.py && sleep 0' 34886 1727204484.77974: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204484.77990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204484.78008: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204484.78121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204484.78135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204484.78198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204484.80091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204484.80230: stderr chunk (state=3): >>><<< 34886 1727204484.80241: stdout chunk (state=3): >>><<< 34886 1727204484.80266: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204484.80281: _low_level_execute_command(): starting 34886 1727204484.80476: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383/AnsiballZ_stat.py && sleep 0' 34886 1727204484.81607: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204484.81610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204484.81797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204484.81824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204484.81837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204484.81959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204484.82023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204484.84201: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34886 1727204484.84296: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # <<< 34886 1727204484.84366: stdout chunk (state=3): >>>import '_weakref' # import '_io' # import 'marshal' # <<< 34886 1727204484.84387: stdout chunk (state=3): >>>import 'posix' # <<< 34886 1727204484.84553: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204484.84557: stdout chunk (state=3): >>>import '_codecs' # <<< 34886 1727204484.84578: stdout chunk (state=3): >>>import 'codecs' # <<< 34886 1727204484.84611: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py 
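
The put/chmod/python sequence above is the non-pipelined execution path: earlier in this task the connection var ansible_pipelining was set to False, so AnsiballZ_stat.py is copied into the remote temp directory over SFTP and then executed with PYTHONVERBOSE=1, which produces the import trace that follows. With pipelining enabled, the module source is fed to the remote interpreter over the existing SSH session instead, skipping the file transfer. A sketch of enabling it via inventory variables, provided privilege escalation settings on the targets permit it (the group_vars file name is illustrative):

    # group_vars/all.yml (illustrative) -- avoid the temp-file round trip per module invocation
    ansible_pipelining: true
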
<<< 34886 1727204484.84639: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 34886 1727204484.84962: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929eb44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929e83ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 34886 1727204484.84975: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929eb6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 34886 1727204484.84978: stdout chunk (state=3): >>>import 'os' # <<< 34886 1727204484.85035: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 34886 1727204484.85047: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 34886 1727204484.85075: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34886 1727204484.85090: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929c690a0> <<< 34886 1727204484.85156: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 34886 1727204484.85178: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929c69fd0> <<< 34886 1727204484.85203: stdout chunk (state=3): >>>import 'site' # <<< 34886 1727204484.85234: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 34886 1727204484.85482: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34886 1727204484.85513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 34886 1727204484.85540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204484.85554: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34886 1727204484.85610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34886 1727204484.85624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34886 1727204484.85676: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34886 1727204484.85683: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929ca7e90> <<< 34886 1727204484.85686: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 34886 1727204484.85719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 34886 1727204484.85739: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929ca7f50> <<< 34886 1727204484.85742: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34886 1727204484.85769: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34886 1727204484.85803: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34886 1727204484.85847: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204484.85875: stdout chunk (state=3): >>>import 'itertools' # <<< 34886 1727204484.85997: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929cdf890> <<< 34886 1727204484.86028: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929cdff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929cbfb60> import '_functools' # <<< 34886 1727204484.86059: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929cbd280> <<< 34886 1727204484.86160: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929ca5040> <<< 34886 1727204484.86197: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34886 1727204484.86254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 34886 1727204484.86347: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 34886 1727204484.86380: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d037d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d023f0> <<< 34886 1727204484.86401: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929cbe270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d00c20> <<< 34886 1727204484.86488: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d34770> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929ca42c0> <<< 34886 1727204484.86523: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34886 1727204484.86588: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929d34c20> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d34ad0> <<< 34886 1727204484.86619: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929d34ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929ca2de0> <<< 34886 1727204484.86720: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204484.86724: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f6929d355b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d35280> import 'importlib.machinery' # <<< 34886 1727204484.86808: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 34886 1727204484.86812: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d364b0> import 'importlib.util' # <<< 34886 1727204484.86843: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34886 1727204484.86897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34886 1727204484.86937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d506e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204484.87040: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929d51e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 34886 1727204484.87044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d52cf0> <<< 34886 1727204484.87094: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929d53350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d52270> <<< 34886 1727204484.87097: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 34886 1727204484.87223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34886 1727204484.87227: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204484.87249: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929d53dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d53500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d36510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py 
<<< 34886 1727204484.87276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34886 1727204484.87425: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34886 1727204484.87429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 34886 1727204484.87460: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929b1bce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929b447d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b44530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929b44800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929b449e0> <<< 34886 1727204484.87498: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b19e80> <<< 34886 1727204484.87503: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34886 1727204484.87613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34886 1727204484.87651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 34886 1727204484.87695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b460f0> <<< 34886 1727204484.87728: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b44d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d36c00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34886 1727204484.87788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204484.87881: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34886 1727204484.87885: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34886 1727204484.87887: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b6e450> <<< 34886 1727204484.87994: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34886 1727204484.87997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34886 1727204484.88057: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b8a5a0> <<< 34886 1727204484.88114: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34886 1727204484.88118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34886 1727204484.88169: stdout chunk (state=3): >>>import 'ntpath' # <<< 34886 1727204484.88335: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929bc32c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34886 1727204484.88339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34886 1727204484.88424: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929be9a60> <<< 34886 1727204484.88500: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929bc33e0> <<< 34886 1727204484.88543: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b8b230> <<< 34886 1727204484.88582: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69299c4410> <<< 34886 1727204484.88817: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b895e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b46ff0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6929b89700> # zipimport: found 30 names in '/tmp/ansible_stat_payload_bqd4hcn8/ansible_stat_payload.zip' # zipimport: zlib available <<< 34886 1727204484.88961: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.88995: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 34886 1727204484.89017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34886 1727204484.89045: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34886 1727204484.89130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34886 1727204484.89161: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929a1e0f0> <<< 34886 1727204484.89179: stdout chunk (state=3): >>>import '_typing' # <<< 34886 1727204484.89376: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69299f4fe0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69299f4170> <<< 34886 1727204484.89392: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.89426: stdout chunk (state=3): >>>import 'ansible' # <<< 34886 1727204484.89454: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34886 1727204484.89483: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 34886 1727204484.89498: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.91062: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.92399: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69299f7f80> <<< 34886 1727204484.92405: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204484.92451: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 34886 1727204484.92454: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 34886 1727204484.92459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 34886 1727204484.92494: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929a49af0> <<< 34886 1727204484.92528: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929a49880> <<< 34886 1727204484.92561: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929a49190> 
<<< 34886 1727204484.92586: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 34886 1727204484.92609: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34886 1727204484.92640: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929a49670> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929a1ed80> <<< 34886 1727204484.92651: stdout chunk (state=3): >>>import 'atexit' # <<< 34886 1727204484.92676: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929a4a810> <<< 34886 1727204484.92718: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929a4aa50> <<< 34886 1727204484.92732: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34886 1727204484.92779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34886 1727204484.92797: stdout chunk (state=3): >>>import '_locale' # <<< 34886 1727204484.92841: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929a4af30> <<< 34886 1727204484.92860: stdout chunk (state=3): >>>import 'pwd' # <<< 34886 1727204484.92877: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34886 1727204484.92899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34886 1727204484.92938: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298acd10> <<< 34886 1727204484.92976: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204484.92979: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69298ae930> <<< 34886 1727204484.93014: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 34886 1727204484.93024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34886 1727204484.93047: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298af2f0> <<< 34886 1727204484.93071: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34886 1727204484.93112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 34886 
1727204484.93144: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b04d0> <<< 34886 1727204484.93147: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34886 1727204484.93205: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34886 1727204484.93210: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 34886 1727204484.93228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34886 1727204484.93269: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b2f90> <<< 34886 1727204484.93320: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69298b32f0> <<< 34886 1727204484.93352: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b1250> <<< 34886 1727204484.93355: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34886 1727204484.93394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34886 1727204484.93427: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34886 1727204484.93434: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34886 1727204484.93488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34886 1727204484.93494: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 34886 1727204484.93519: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b6f30> import '_tokenize' # <<< 34886 1727204484.93596: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b5a30> <<< 34886 1727204484.93616: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b5790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 34886 1727204484.93640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34886 1727204484.93705: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b7dd0> <<< 34886 1727204484.93755: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b1760> <<< 34886 1727204484.93779: stdout chunk (state=3): >>># extension 
module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69298ff020> <<< 34886 1727204484.93810: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298ff1a0> <<< 34886 1727204484.93827: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 34886 1727204484.93859: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34886 1727204484.93876: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34886 1727204484.93909: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929900d70> <<< 34886 1727204484.93934: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929900b30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34886 1727204484.94053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34886 1727204484.94119: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929903260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69299013d0> <<< 34886 1727204484.94136: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34886 1727204484.94197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204484.94241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 34886 1727204484.94245: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 34886 1727204484.94293: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f692990aa80> <<< 34886 1727204484.94697: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929903410> <<< 34886 1727204484.94860: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f692990b8c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f692990ba70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f692990bd10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298ff4a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f692990f500> <<< 34886 1727204484.94960: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204484.95103: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69299103e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f692990dc70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f692990eff0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f692990d850> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 34886 1727204484.95196: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.95297: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.95418: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 34886 1727204484.95423: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 34886 1727204484.95518: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 34886 1727204484.95707: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.96340: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.97031: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 34886 1727204484.97088: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34886 1727204484.97168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69299986b0> <<< 34886 1727204484.97279: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 34886 1727204484.97392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34886 1727204484.97396: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929999610> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929912f30> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 34886 1727204484.97446: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 34886 1727204484.97631: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.97832: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 34886 1727204484.97921: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929999c10> <<< 34886 1727204484.97939: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.98417: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.99023: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.99058: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.99160: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34886 1727204484.99164: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.99205: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.99368: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 34886 1727204484.99372: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34886 1727204484.99457: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34886 1727204484.99499: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.99512: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 34886 1727204484.99616: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.parsing.convert_bool' # <<< 34886 1727204484.99620: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204484.99934: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204485.00186: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34886 1727204485.00267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34886 1727204485.00282: stdout chunk (state=3): >>>import '_ast' # <<< 34886 1727204485.00384: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f692999bfe0> # zipimport: zlib available <<< 34886 1727204485.00462: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204485.00550: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 34886 1727204485.00613: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 34886 1727204485.00617: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34886 1727204485.00837: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204485.00840: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69297a5eb0> <<< 34886 1727204485.00905: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 34886 1727204485.00909: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69297a67b0> <<< 34886 1727204485.00913: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f692999b140> <<< 34886 1727204485.00936: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204485.01015: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # <<< 34886 1727204485.01019: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204485.01068: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204485.01157: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204485.01259: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34886 1727204485.01305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34886 1727204485.01395: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69297a55e0> <<< 34886 1727204485.01440: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69297a6990> <<< 34886 1727204485.01487: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 34886 1727204485.01596: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204485.01601: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204485.01625: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204485.01724: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 34886 1727204485.01727: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34886 1727204485.01753: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34886 1727204485.01845: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34886 1727204485.01848: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34886 1727204485.01936: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929836c30> <<< 34886 1727204485.01986: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69297b0980> <<< 34886 1727204485.02090: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69297ae9f0> <<< 34886 1727204485.02094: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69297ae840> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 34886 1727204485.02096: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204485.02229: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 34886 1727204485.02233: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34886 1727204485.02239: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204485.02257: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 34886 1727204485.02516: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34886 1727204485.02632: stdout chunk (state=3): >>># zipimport: zlib available <<< 34886 1727204485.02777: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # 
destroy __main__ <<< 34886 1727204485.03127: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 34886 1727204485.03142: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread<<< 34886 1727204485.03180: stdout chunk (state=3): >>> # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins <<< 34886 1727204485.03223: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery <<< 34886 1727204485.03227: stdout chunk (state=3): >>># cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing 
ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale <<< 34886 1727204485.03239: stdout chunk (state=3): >>># cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess <<< 34886 1727204485.03255: stdout chunk (state=3): >>># cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 <<< 34886 1727204485.03281: stdout chunk (state=3): >>># cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text <<< 34886 1727204485.03313: stdout chunk (state=3): >>># destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 34886 1727204485.03568: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34886 1727204485.03572: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 34886 1727204485.03596: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 34886 1727204485.03608: stdout chunk (state=3): >>># destroy binascii # destroy struct<<< 34886 1727204485.03716: stdout chunk (state=3): >>> # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 34886 1727204485.03821: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 34886 1727204485.03856: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 34886 1727204485.03878: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 
# cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 34886 1727204485.03960: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 34886 1727204485.04086: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 34886 1727204485.04092: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34886 1727204485.04214: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 34886 1727204485.04308: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34886 1727204485.04349: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 34886 1727204485.04354: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 34886 1727204485.04357: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 34886 1727204485.04373: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34886 1727204485.04459: stdout chunk (state=3): >>># destroy codecs # destroy 
encodings.aliases <<< 34886 1727204485.04522: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 34886 1727204485.04632: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34886 1727204485.04968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204485.05202: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. <<< 34886 1727204485.05218: stderr chunk (state=3): >>><<< 34886 1727204485.05239: stdout chunk (state=3): >>><<< 34886 1727204485.05435: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929eb44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929e83ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929eb6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929c690a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929c69fd0> import 'site' # Python 3.12.6 (main, Sep 
9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929ca7e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929ca7f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929cdf890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929cdff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929cbfb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929cbd280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929ca5040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d037d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d023f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6929cbe270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d00c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d34770> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929ca42c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929d34c20> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d34ad0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929d34ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929ca2de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d355b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d35280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d364b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d506e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929d51e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code 
object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d52cf0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929d53350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d52270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929d53dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d53500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d36510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929b1bce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929b447d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b44530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929b44800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929b449e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b19e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from 
'/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b460f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b44d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929d36c00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b6e450> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b8a5a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929bc32c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929be9a60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929bc33e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b8b230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69299c4410> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b895e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929b46ff0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6929b89700> # zipimport: found 30 names in '/tmp/ansible_stat_payload_bqd4hcn8/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches 
/usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929a1e0f0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69299f4fe0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69299f4170> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69299f7f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929a49af0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929a49880> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929a49190> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929a49670> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929a1ed80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929a4a810> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929a4aa50> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929a4af30> import 'pwd' # # 
/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298acd10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69298ae930> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298af2f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b04d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b2f90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69298b32f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b1250> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b6f30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b5a30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b5790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b7dd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298b1760> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69298ff020> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298ff1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929900d70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929900b30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6929903260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69299013d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f692990aa80> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929903410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f692990b8c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f692990ba70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 
'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f692990bd10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69298ff4a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f692990f500> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69299103e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f692990dc70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f692990eff0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f692990d850> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69299986b0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929999610> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929912f30> import 'ansible.module_utils.compat.selinux' # 
# zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929999c10> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f692999bfe0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69297a5eb0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69297a67b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f692999b140> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f69297a55e0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69297a6990> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6929836c30> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69297b0980> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69297ae9f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f69297ae840> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # 
cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy 
socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # 
destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy 
ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
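The stat result embedded in the module output above ("exists": false for /run/ostree-booted) comes from the "Check if system is ostree" task that this run reports on below. As a point of reference only, a minimal sketch of a task of that shape, using the path from the module_args and the __ostree_booted_stat register name that appears later in the trace (el_repo_setup.yml itself is not reproduced in this log):

    # Sketch: check for the ostree marker file and keep the result for later tasks.
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat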
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy 
re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 34886 1727204485.06700: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204485.06704: _low_level_execute_command(): starting 34886 1727204485.06707: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204484.3684309-35292-200700888934383/ > /dev/null 2>&1 && sleep 0' 34886 1727204485.06710: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204485.06712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204485.06715: stderr chunk (state=3): >>>debug2: match found <<< 34886 1727204485.06717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204485.06719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204485.06722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204485.06725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204485.09056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204485.09183: stderr chunk (state=3): >>><<< 34886 1727204485.09187: stdout chunk (state=3): >>><<< 34886 1727204485.09222: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204485.09314: handler run complete 34886 1727204485.09317: attempt loop complete, returning result 34886 1727204485.09320: _execute() done 34886 1727204485.09322: dumping result to json 34886 1727204485.09326: done dumping result, returning 34886 1727204485.09329: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree [12b410aa-8751-04b9-2e74-0000000000cc] 34886 1727204485.09331: sending task result for task 12b410aa-8751-04b9-2e74-0000000000cc 34886 
1727204485.09595: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000cc 34886 1727204485.09598: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 34886 1727204485.09668: no more pending results, returning what we have 34886 1727204485.09671: results queue empty 34886 1727204485.09672: checking for any_errors_fatal 34886 1727204485.09680: done checking for any_errors_fatal 34886 1727204485.09681: checking for max_fail_percentage 34886 1727204485.09683: done checking for max_fail_percentage 34886 1727204485.09684: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.09685: done checking to see if all hosts have failed 34886 1727204485.09686: getting the remaining hosts for this loop 34886 1727204485.09688: done getting the remaining hosts for this loop 34886 1727204485.09694: getting the next task for host managed-node3 34886 1727204485.09699: done getting next task for host managed-node3 34886 1727204485.09702: ^ task is: TASK: Set flag to indicate system is ostree 34886 1727204485.09705: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204485.09718: getting variables 34886 1727204485.09720: in VariableManager get_vars() 34886 1727204485.09751: Calling all_inventory to load vars for managed-node3 34886 1727204485.09755: Calling groups_inventory to load vars for managed-node3 34886 1727204485.09759: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.09771: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.09775: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.09779: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.10072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.10424: done with get_vars() 34886 1727204485.10437: done getting variables 34886 1727204485.10559: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.834) 0:00:03.273 ***** 34886 1727204485.10605: entering _queue_task() for managed-node3/set_fact 34886 1727204485.10607: Creating lock for set_fact 34886 1727204485.10955: worker is 1 (out of 1 available) 34886 1727204485.10969: exiting _queue_task() for managed-node3/set_fact 34886 1727204485.10982: done queuing things up, now waiting for results queue to drain 34886 1727204485.10984: waiting for pending results... 
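The set_fact task being queued here (el_repo_setup.yml:22) derives __network_is_ostree from that stat result; the trace below shows the conditional (not __network_is_ostree is defined) evaluating to True and the resulting fact value false. A plausible sketch of such a task, with the Jinja expression being an assumption rather than a quote from the tasks file:

    # Sketch: record whether the host is ostree-based, using the earlier stat result.
    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined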
34886 1727204485.11811: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 34886 1727204485.11818: in run() - task 12b410aa-8751-04b9-2e74-0000000000cd 34886 1727204485.12196: variable 'ansible_search_path' from source: unknown 34886 1727204485.12199: variable 'ansible_search_path' from source: unknown 34886 1727204485.12202: calling self._execute() 34886 1727204485.12222: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.12237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.12255: variable 'omit' from source: magic vars 34886 1727204485.13898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204485.14408: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204485.14662: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204485.14666: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204485.14795: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204485.15095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204485.15098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204485.15101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204485.15331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204485.15557: Evaluated conditional (not __network_is_ostree is defined): True 34886 1727204485.15575: variable 'omit' from source: magic vars 34886 1727204485.15768: variable 'omit' from source: magic vars 34886 1727204485.16035: variable '__ostree_booted_stat' from source: set_fact 34886 1727204485.16154: variable 'omit' from source: magic vars 34886 1727204485.16397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204485.16401: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204485.16409: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204485.16537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204485.16540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204485.16580: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204485.16588: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.16660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.16994: Set connection var ansible_timeout to 10 34886 1727204485.16998: 
Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204485.17053: Set connection var ansible_connection to ssh 34886 1727204485.17056: Set connection var ansible_shell_executable to /bin/sh 34886 1727204485.17059: Set connection var ansible_pipelining to False 34886 1727204485.17061: Set connection var ansible_shell_type to sh 34886 1727204485.17204: variable 'ansible_shell_executable' from source: unknown 34886 1727204485.17208: variable 'ansible_connection' from source: unknown 34886 1727204485.17210: variable 'ansible_module_compression' from source: unknown 34886 1727204485.17212: variable 'ansible_shell_type' from source: unknown 34886 1727204485.17214: variable 'ansible_shell_executable' from source: unknown 34886 1727204485.17216: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.17226: variable 'ansible_pipelining' from source: unknown 34886 1727204485.17233: variable 'ansible_timeout' from source: unknown 34886 1727204485.17242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.17624: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204485.17628: variable 'omit' from source: magic vars 34886 1727204485.17638: starting attempt loop 34886 1727204485.17732: running the handler 34886 1727204485.17735: handler run complete 34886 1727204485.17743: attempt loop complete, returning result 34886 1727204485.17746: _execute() done 34886 1727204485.17748: dumping result to json 34886 1727204485.17751: done dumping result, returning 34886 1727204485.17753: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [12b410aa-8751-04b9-2e74-0000000000cd] 34886 1727204485.17756: sending task result for task 12b410aa-8751-04b9-2e74-0000000000cd ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 34886 1727204485.18214: no more pending results, returning what we have 34886 1727204485.18218: results queue empty 34886 1727204485.18219: checking for any_errors_fatal 34886 1727204485.18225: done checking for any_errors_fatal 34886 1727204485.18226: checking for max_fail_percentage 34886 1727204485.18228: done checking for max_fail_percentage 34886 1727204485.18229: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.18230: done checking to see if all hosts have failed 34886 1727204485.18231: getting the remaining hosts for this loop 34886 1727204485.18233: done getting the remaining hosts for this loop 34886 1727204485.18238: getting the next task for host managed-node3 34886 1727204485.18312: done getting next task for host managed-node3 34886 1727204485.18316: ^ task is: TASK: Fix CentOS6 Base repo 34886 1727204485.18319: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 34886 1727204485.18324: getting variables 34886 1727204485.18326: in VariableManager get_vars() 34886 1727204485.18480: Calling all_inventory to load vars for managed-node3 34886 1727204485.18484: Calling groups_inventory to load vars for managed-node3 34886 1727204485.18525: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.18538: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.18543: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.18547: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.19348: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000cd 34886 1727204485.19358: WORKER PROCESS EXITING 34886 1727204485.19387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.20068: done with get_vars() 34886 1727204485.20081: done getting variables 34886 1727204485.20452: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.098) 0:00:03.372 ***** 34886 1727204485.20484: entering _queue_task() for managed-node3/copy 34886 1727204485.21134: worker is 1 (out of 1 available) 34886 1727204485.21146: exiting _queue_task() for managed-node3/copy 34886 1727204485.21158: done queuing things up, now waiting for results queue to drain 34886 1727204485.21160: waiting for pending results... 
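The "Set flag to indicate system is ostree" task above completes with __network_is_ostree: false, and the "Fix CentOS6 Base repo" copy task from el_repo_setup.yml:26 is queued next. A minimal sketch of what these two tasks plausibly look like, reconstructed only from the conditionals and results visible in this log (the stat-derived expression, the repo file path, and the file content are assumptions, not the actual role source):

```yaml
# Sketch only: bodies inferred from the log, not copied from el_repo_setup.yml.
- name: Set flag to indicate system is ostree
  set_fact:
    # __ostree_booted_stat is the registered stat result referenced in the log;
    # deriving the flag from .stat.exists is an assumption.
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined

- name: Fix CentOS6 Base repo
  copy:
    # Destination and content are placeholders; only the when clause comes from
    # the evaluated conditional shown a few entries below.
    dest: /etc/yum.repos.d/CentOS-Base.repo
    content: |
      # vault.centos.org baseurl stanza would go here
  when: ansible_distribution == 'CentOS'
```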
34886 1727204485.21571: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 34886 1727204485.21673: in run() - task 12b410aa-8751-04b9-2e74-0000000000cf 34886 1727204485.22094: variable 'ansible_search_path' from source: unknown 34886 1727204485.22098: variable 'ansible_search_path' from source: unknown 34886 1727204485.22101: calling self._execute() 34886 1727204485.22103: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.22106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.22108: variable 'omit' from source: magic vars 34886 1727204485.23229: variable 'ansible_distribution' from source: facts 34886 1727204485.23261: Evaluated conditional (ansible_distribution == 'CentOS'): False 34886 1727204485.23695: when evaluation is False, skipping this task 34886 1727204485.23699: _execute() done 34886 1727204485.23701: dumping result to json 34886 1727204485.23704: done dumping result, returning 34886 1727204485.23707: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [12b410aa-8751-04b9-2e74-0000000000cf] 34886 1727204485.23709: sending task result for task 12b410aa-8751-04b9-2e74-0000000000cf 34886 1727204485.23794: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000cf 34886 1727204485.23797: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 34886 1727204485.23878: no more pending results, returning what we have 34886 1727204485.23881: results queue empty 34886 1727204485.23882: checking for any_errors_fatal 34886 1727204485.23887: done checking for any_errors_fatal 34886 1727204485.23887: checking for max_fail_percentage 34886 1727204485.23891: done checking for max_fail_percentage 34886 1727204485.23892: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.23893: done checking to see if all hosts have failed 34886 1727204485.23894: getting the remaining hosts for this loop 34886 1727204485.23896: done getting the remaining hosts for this loop 34886 1727204485.23899: getting the next task for host managed-node3 34886 1727204485.23905: done getting next task for host managed-node3 34886 1727204485.23908: ^ task is: TASK: Include the task 'enable_epel.yml' 34886 1727204485.23911: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204485.23914: getting variables 34886 1727204485.23916: in VariableManager get_vars() 34886 1727204485.23945: Calling all_inventory to load vars for managed-node3 34886 1727204485.23949: Calling groups_inventory to load vars for managed-node3 34886 1727204485.23953: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.23964: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.23968: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.23972: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.24829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.25654: done with get_vars() 34886 1727204485.25667: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.054) 0:00:03.427 ***** 34886 1727204485.25979: entering _queue_task() for managed-node3/include_tasks 34886 1727204485.26727: worker is 1 (out of 1 available) 34886 1727204485.26858: exiting _queue_task() for managed-node3/include_tasks 34886 1727204485.26870: done queuing things up, now waiting for results queue to drain 34886 1727204485.26872: waiting for pending results... 34886 1727204485.27180: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 34886 1727204485.27525: in run() - task 12b410aa-8751-04b9-2e74-0000000000d0 34886 1727204485.27535: variable 'ansible_search_path' from source: unknown 34886 1727204485.27539: variable 'ansible_search_path' from source: unknown 34886 1727204485.27580: calling self._execute() 34886 1727204485.27794: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.27905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.27918: variable 'omit' from source: magic vars 34886 1727204485.29499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204485.34990: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204485.35193: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204485.35277: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204485.35381: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204485.35475: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204485.35702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204485.35798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204485.35831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 34886 1727204485.36000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204485.36023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204485.36344: variable '__network_is_ostree' from source: set_fact 34886 1727204485.36366: Evaluated conditional (not __network_is_ostree | d(false)): True 34886 1727204485.36373: _execute() done 34886 1727204485.36380: dumping result to json 34886 1727204485.36383: done dumping result, returning 34886 1727204485.36449: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [12b410aa-8751-04b9-2e74-0000000000d0] 34886 1727204485.36453: sending task result for task 12b410aa-8751-04b9-2e74-0000000000d0 34886 1727204485.36725: no more pending results, returning what we have 34886 1727204485.36733: in VariableManager get_vars() 34886 1727204485.36774: Calling all_inventory to load vars for managed-node3 34886 1727204485.36778: Calling groups_inventory to load vars for managed-node3 34886 1727204485.36782: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.36799: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.36804: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.36809: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.37476: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000d0 34886 1727204485.37481: WORKER PROCESS EXITING 34886 1727204485.37606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.38331: done with get_vars() 34886 1727204485.38342: variable 'ansible_search_path' from source: unknown 34886 1727204485.38344: variable 'ansible_search_path' from source: unknown 34886 1727204485.38494: we have included files to process 34886 1727204485.38496: generating all_blocks data 34886 1727204485.38498: done generating all_blocks data 34886 1727204485.38509: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34886 1727204485.38511: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34886 1727204485.38515: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34886 1727204485.40339: done processing included file 34886 1727204485.40341: iterating over new_blocks loaded from include file 34886 1727204485.40343: in VariableManager get_vars() 34886 1727204485.40475: done with get_vars() 34886 1727204485.40478: filtering new block on tags 34886 1727204485.40510: done filtering new block on tags 34886 1727204485.40514: in VariableManager get_vars() 34886 1727204485.40529: done with get_vars() 34886 1727204485.40531: filtering new block on tags 34886 1727204485.40546: done filtering new block on tags 34886 1727204485.40549: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 34886 1727204485.40555: extending task lists for all hosts with included blocks 
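Here the conditional not __network_is_ostree | d(false) evaluates to True, so enable_epel.yml is loaded, filtered on tags, and its blocks are appended to the task list for managed-node3. A hedged sketch of the include as it is likely written (the relative path is an assumption):

```yaml
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml   # path relative to the test playbook; assumed
  when: not __network_is_ostree | d(false)
```

Because include_tasks is dynamic, the included blocks only become part of the host state at run time, which is why the log extends the task lists at this point rather than during initial parsing.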
34886 1727204485.40811: done extending task lists 34886 1727204485.40813: done processing included files 34886 1727204485.40814: results queue empty 34886 1727204485.40815: checking for any_errors_fatal 34886 1727204485.40820: done checking for any_errors_fatal 34886 1727204485.40821: checking for max_fail_percentage 34886 1727204485.40822: done checking for max_fail_percentage 34886 1727204485.40823: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.40824: done checking to see if all hosts have failed 34886 1727204485.40825: getting the remaining hosts for this loop 34886 1727204485.40826: done getting the remaining hosts for this loop 34886 1727204485.40829: getting the next task for host managed-node3 34886 1727204485.40834: done getting next task for host managed-node3 34886 1727204485.40837: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 34886 1727204485.40840: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204485.40842: getting variables 34886 1727204485.40844: in VariableManager get_vars() 34886 1727204485.40853: Calling all_inventory to load vars for managed-node3 34886 1727204485.40856: Calling groups_inventory to load vars for managed-node3 34886 1727204485.40859: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.40865: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.40874: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.40878: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.41604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.42271: done with get_vars() 34886 1727204485.42285: done getting variables 34886 1727204485.42483: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 34886 1727204485.43017: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.172) 0:00:03.599 ***** 34886 1727204485.43192: entering _queue_task() for managed-node3/command 34886 1727204485.43195: Creating lock for command 34886 1727204485.43865: worker is 1 (out of 1 available) 34886 1727204485.43879: exiting _queue_task() for managed-node3/command 34886 1727204485.43942: done queuing things up, now waiting for results queue to drain 34886 1727204485.43944: waiting for pending results... 
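The banner renders "Create EPEL {{ ansible_distribution_major_version }}" as "Create EPEL 39" because the templated task name is resolved from facts before execution; the entries that follow then skip the task since ansible_distribution is neither RedHat nor CentOS on this node. A hedged sketch of such a guarded command task (the command body is an illustrative placeholder, not the real task):

```yaml
- name: Create EPEL {{ ansible_distribution_major_version }}
  # Placeholder command; only the name template and when clause are taken from the log.
  command: >-
    rpm -iv https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{
    ansible_distribution_major_version }}.noarch.rpm
  when: ansible_distribution in ['RedHat', 'CentOS']
```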
34886 1727204485.44318: running TaskExecutor() for managed-node3/TASK: Create EPEL 39 34886 1727204485.44578: in run() - task 12b410aa-8751-04b9-2e74-0000000000ea 34886 1727204485.44717: variable 'ansible_search_path' from source: unknown 34886 1727204485.44724: variable 'ansible_search_path' from source: unknown 34886 1727204485.44747: calling self._execute() 34886 1727204485.44955: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.45003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.45152: variable 'omit' from source: magic vars 34886 1727204485.46155: variable 'ansible_distribution' from source: facts 34886 1727204485.46159: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 34886 1727204485.46161: when evaluation is False, skipping this task 34886 1727204485.46164: _execute() done 34886 1727204485.46167: dumping result to json 34886 1727204485.46169: done dumping result, returning 34886 1727204485.46172: done running TaskExecutor() for managed-node3/TASK: Create EPEL 39 [12b410aa-8751-04b9-2e74-0000000000ea] 34886 1727204485.46175: sending task result for task 12b410aa-8751-04b9-2e74-0000000000ea 34886 1727204485.46490: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000ea skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 34886 1727204485.46603: no more pending results, returning what we have 34886 1727204485.46609: results queue empty 34886 1727204485.46610: checking for any_errors_fatal 34886 1727204485.46612: done checking for any_errors_fatal 34886 1727204485.46613: checking for max_fail_percentage 34886 1727204485.46615: done checking for max_fail_percentage 34886 1727204485.46615: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.46617: done checking to see if all hosts have failed 34886 1727204485.46617: getting the remaining hosts for this loop 34886 1727204485.46619: done getting the remaining hosts for this loop 34886 1727204485.46624: getting the next task for host managed-node3 34886 1727204485.46631: done getting next task for host managed-node3 34886 1727204485.46634: ^ task is: TASK: Install yum-utils package 34886 1727204485.46639: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204485.46643: getting variables 34886 1727204485.46645: in VariableManager get_vars() 34886 1727204485.46715: Calling all_inventory to load vars for managed-node3 34886 1727204485.46719: Calling groups_inventory to load vars for managed-node3 34886 1727204485.46723: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.46730: WORKER PROCESS EXITING 34886 1727204485.46776: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.46781: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.46786: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.47358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.48138: done with get_vars() 34886 1727204485.48151: done getting variables 34886 1727204485.48373: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.052) 0:00:03.651 ***** 34886 1727204485.48410: entering _queue_task() for managed-node3/package 34886 1727204485.48412: Creating lock for package 34886 1727204485.48666: worker is 1 (out of 1 available) 34886 1727204485.48682: exiting _queue_task() for managed-node3/package 34886 1727204485.48698: done queuing things up, now waiting for results queue to drain 34886 1727204485.48700: waiting for pending results... 
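The same guard applies to the yum-utils install at enable_epel.yml:26: the package action plugin is loaded, and the entries that follow skip the task on this host. A hedged sketch (the package state and any extra conditions are assumptions):

```yaml
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when: ansible_distribution in ['RedHat', 'CentOS']
```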
34886 1727204485.48867: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 34886 1727204485.48955: in run() - task 12b410aa-8751-04b9-2e74-0000000000eb 34886 1727204485.48968: variable 'ansible_search_path' from source: unknown 34886 1727204485.48972: variable 'ansible_search_path' from source: unknown 34886 1727204485.49004: calling self._execute() 34886 1727204485.49073: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.49080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.49091: variable 'omit' from source: magic vars 34886 1727204485.49416: variable 'ansible_distribution' from source: facts 34886 1727204485.49430: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 34886 1727204485.49434: when evaluation is False, skipping this task 34886 1727204485.49436: _execute() done 34886 1727204485.49440: dumping result to json 34886 1727204485.49445: done dumping result, returning 34886 1727204485.49452: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [12b410aa-8751-04b9-2e74-0000000000eb] 34886 1727204485.49458: sending task result for task 12b410aa-8751-04b9-2e74-0000000000eb 34886 1727204485.49554: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000eb 34886 1727204485.49557: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 34886 1727204485.49636: no more pending results, returning what we have 34886 1727204485.49640: results queue empty 34886 1727204485.49641: checking for any_errors_fatal 34886 1727204485.49647: done checking for any_errors_fatal 34886 1727204485.49648: checking for max_fail_percentage 34886 1727204485.49650: done checking for max_fail_percentage 34886 1727204485.49651: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.49652: done checking to see if all hosts have failed 34886 1727204485.49653: getting the remaining hosts for this loop 34886 1727204485.49654: done getting the remaining hosts for this loop 34886 1727204485.49658: getting the next task for host managed-node3 34886 1727204485.49663: done getting next task for host managed-node3 34886 1727204485.49666: ^ task is: TASK: Enable EPEL 7 34886 1727204485.49669: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204485.49673: getting variables 34886 1727204485.49674: in VariableManager get_vars() 34886 1727204485.49703: Calling all_inventory to load vars for managed-node3 34886 1727204485.49715: Calling groups_inventory to load vars for managed-node3 34886 1727204485.49719: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.49729: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.49733: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.49736: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.49891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.50092: done with get_vars() 34886 1727204485.50101: done getting variables 34886 1727204485.50166: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.018) 0:00:03.669 ***** 34886 1727204485.50218: entering _queue_task() for managed-node3/command 34886 1727204485.50500: worker is 1 (out of 1 available) 34886 1727204485.50513: exiting _queue_task() for managed-node3/command 34886 1727204485.50528: done queuing things up, now waiting for results queue to drain 34886 1727204485.50530: waiting for pending results... 34886 1727204485.51022: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 34886 1727204485.51028: in run() - task 12b410aa-8751-04b9-2e74-0000000000ec 34886 1727204485.51295: variable 'ansible_search_path' from source: unknown 34886 1727204485.51299: variable 'ansible_search_path' from source: unknown 34886 1727204485.51302: calling self._execute() 34886 1727204485.51305: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.51308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.51311: variable 'omit' from source: magic vars 34886 1727204485.52144: variable 'ansible_distribution' from source: facts 34886 1727204485.52155: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 34886 1727204485.52161: when evaluation is False, skipping this task 34886 1727204485.52165: _execute() done 34886 1727204485.52170: dumping result to json 34886 1727204485.52173: done dumping result, returning 34886 1727204485.52180: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [12b410aa-8751-04b9-2e74-0000000000ec] 34886 1727204485.52187: sending task result for task 12b410aa-8751-04b9-2e74-0000000000ec 34886 1727204485.52297: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000ec 34886 1727204485.52302: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 34886 1727204485.52357: no more pending results, returning what we have 34886 1727204485.52360: results queue empty 34886 1727204485.52361: checking for any_errors_fatal 34886 1727204485.52367: done checking for any_errors_fatal 34886 1727204485.52368: checking 
for max_fail_percentage 34886 1727204485.52370: done checking for max_fail_percentage 34886 1727204485.52371: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.52372: done checking to see if all hosts have failed 34886 1727204485.52373: getting the remaining hosts for this loop 34886 1727204485.52374: done getting the remaining hosts for this loop 34886 1727204485.52378: getting the next task for host managed-node3 34886 1727204485.52383: done getting next task for host managed-node3 34886 1727204485.52386: ^ task is: TASK: Enable EPEL 8 34886 1727204485.52431: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204485.52436: getting variables 34886 1727204485.52438: in VariableManager get_vars() 34886 1727204485.52467: Calling all_inventory to load vars for managed-node3 34886 1727204485.52470: Calling groups_inventory to load vars for managed-node3 34886 1727204485.52472: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.52480: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.52482: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.52484: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.52672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.52855: done with get_vars() 34886 1727204485.52864: done getting variables 34886 1727204485.52912: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.027) 0:00:03.697 ***** 34886 1727204485.52938: entering _queue_task() for managed-node3/command 34886 1727204485.53147: worker is 1 (out of 1 available) 34886 1727204485.53163: exiting _queue_task() for managed-node3/command 34886 1727204485.53175: done queuing things up, now waiting for results queue to drain 34886 1727204485.53177: waiting for pending results... 
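"Enable EPEL 7" (enable_epel.yml:32) and "Enable EPEL 8" (enable_epel.yml:37) are both command tasks skipped by the same distribution check. A hedged sketch of the pair (the exact commands, and any additional version guards the real file may use, are assumptions):

```yaml
- name: Enable EPEL 7
  command: yum-config-manager --enable epel      # placeholder command
  when: ansible_distribution in ['RedHat', 'CentOS']

- name: Enable EPEL 8
  command: dnf config-manager --set-enabled epel # placeholder command
  when: ansible_distribution in ['RedHat', 'CentOS']
```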
34886 1727204485.53335: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 34886 1727204485.53414: in run() - task 12b410aa-8751-04b9-2e74-0000000000ed 34886 1727204485.53430: variable 'ansible_search_path' from source: unknown 34886 1727204485.53435: variable 'ansible_search_path' from source: unknown 34886 1727204485.53463: calling self._execute() 34886 1727204485.53528: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.53533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.53544: variable 'omit' from source: magic vars 34886 1727204485.53850: variable 'ansible_distribution' from source: facts 34886 1727204485.53863: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 34886 1727204485.53867: when evaluation is False, skipping this task 34886 1727204485.53870: _execute() done 34886 1727204485.53873: dumping result to json 34886 1727204485.53875: done dumping result, returning 34886 1727204485.53886: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [12b410aa-8751-04b9-2e74-0000000000ed] 34886 1727204485.53891: sending task result for task 12b410aa-8751-04b9-2e74-0000000000ed 34886 1727204485.53987: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000ed 34886 1727204485.53990: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 34886 1727204485.54042: no more pending results, returning what we have 34886 1727204485.54046: results queue empty 34886 1727204485.54047: checking for any_errors_fatal 34886 1727204485.54051: done checking for any_errors_fatal 34886 1727204485.54052: checking for max_fail_percentage 34886 1727204485.54053: done checking for max_fail_percentage 34886 1727204485.54054: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.54055: done checking to see if all hosts have failed 34886 1727204485.54056: getting the remaining hosts for this loop 34886 1727204485.54057: done getting the remaining hosts for this loop 34886 1727204485.54061: getting the next task for host managed-node3 34886 1727204485.54069: done getting next task for host managed-node3 34886 1727204485.54072: ^ task is: TASK: Enable EPEL 6 34886 1727204485.54076: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204485.54079: getting variables 34886 1727204485.54080: in VariableManager get_vars() 34886 1727204485.54108: Calling all_inventory to load vars for managed-node3 34886 1727204485.54112: Calling groups_inventory to load vars for managed-node3 34886 1727204485.54114: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.54124: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.54127: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.54129: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.54277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.54460: done with get_vars() 34886 1727204485.54468: done getting variables 34886 1727204485.54514: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.015) 0:00:03.713 ***** 34886 1727204485.54539: entering _queue_task() for managed-node3/copy 34886 1727204485.54732: worker is 1 (out of 1 available) 34886 1727204485.54746: exiting _queue_task() for managed-node3/copy 34886 1727204485.54759: done queuing things up, now waiting for results queue to drain 34886 1727204485.54762: waiting for pending results... 34886 1727204485.54914: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 34886 1727204485.54986: in run() - task 12b410aa-8751-04b9-2e74-0000000000ef 34886 1727204485.55002: variable 'ansible_search_path' from source: unknown 34886 1727204485.55006: variable 'ansible_search_path' from source: unknown 34886 1727204485.55033: calling self._execute() 34886 1727204485.55091: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.55096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.55111: variable 'omit' from source: magic vars 34886 1727204485.55662: variable 'ansible_distribution' from source: facts 34886 1727204485.55665: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 34886 1727204485.55674: when evaluation is False, skipping this task 34886 1727204485.55677: _execute() done 34886 1727204485.55680: dumping result to json 34886 1727204485.55683: done dumping result, returning 34886 1727204485.55691: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [12b410aa-8751-04b9-2e74-0000000000ef] 34886 1727204485.55701: sending task result for task 12b410aa-8751-04b9-2e74-0000000000ef 34886 1727204485.55788: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000ef 34886 1727204485.55793: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 34886 1727204485.55850: no more pending results, returning what we have 34886 1727204485.55854: results queue empty 34886 1727204485.55855: checking for any_errors_fatal 34886 1727204485.55860: done checking for any_errors_fatal 34886 1727204485.55861: checking for 
max_fail_percentage 34886 1727204485.55863: done checking for max_fail_percentage 34886 1727204485.55864: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.55865: done checking to see if all hosts have failed 34886 1727204485.55865: getting the remaining hosts for this loop 34886 1727204485.55867: done getting the remaining hosts for this loop 34886 1727204485.55870: getting the next task for host managed-node3 34886 1727204485.55879: done getting next task for host managed-node3 34886 1727204485.55882: ^ task is: TASK: Set network provider to 'nm' 34886 1727204485.55884: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204485.55888: getting variables 34886 1727204485.55891: in VariableManager get_vars() 34886 1727204485.55917: Calling all_inventory to load vars for managed-node3 34886 1727204485.55923: Calling groups_inventory to load vars for managed-node3 34886 1727204485.55926: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.55938: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.55953: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.55958: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.56450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.56818: done with get_vars() 34886 1727204485.56828: done getting variables 34886 1727204485.56909: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:13 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.023) 0:00:03.737 ***** 34886 1727204485.56938: entering _queue_task() for managed-node3/set_fact 34886 1727204485.57197: worker is 1 (out of 1 available) 34886 1727204485.57210: exiting _queue_task() for managed-node3/set_fact 34886 1727204485.57223: done queuing things up, now waiting for results queue to drain 34886 1727204485.57225: waiting for pending results... 
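The set_fact at tests_ipv6_nm.yml:13 is the provider-selection task of the nm wrapper playbook; the ok result a few entries below contains exactly the fact it sets. A minimal sketch consistent with that result:

```yaml
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
```

The wrapper presumably then pulls in the shared IPv6 test playbook (the next play header below is sourced from playbooks/tests_ipv6.yml), so all subsequent tasks run with network_provider already set to nm.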
34886 1727204485.57815: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' 34886 1727204485.57824: in run() - task 12b410aa-8751-04b9-2e74-000000000007 34886 1727204485.57828: variable 'ansible_search_path' from source: unknown 34886 1727204485.57876: calling self._execute() 34886 1727204485.58016: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.58023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.58027: variable 'omit' from source: magic vars 34886 1727204485.58138: variable 'omit' from source: magic vars 34886 1727204485.58195: variable 'omit' from source: magic vars 34886 1727204485.58241: variable 'omit' from source: magic vars 34886 1727204485.58300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204485.58396: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204485.58400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204485.58403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204485.58415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204485.58450: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204485.58454: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.58457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.58609: Set connection var ansible_timeout to 10 34886 1727204485.58625: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204485.58628: Set connection var ansible_connection to ssh 34886 1727204485.58637: Set connection var ansible_shell_executable to /bin/sh 34886 1727204485.58648: Set connection var ansible_pipelining to False 34886 1727204485.58651: Set connection var ansible_shell_type to sh 34886 1727204485.58698: variable 'ansible_shell_executable' from source: unknown 34886 1727204485.58702: variable 'ansible_connection' from source: unknown 34886 1727204485.58705: variable 'ansible_module_compression' from source: unknown 34886 1727204485.58707: variable 'ansible_shell_type' from source: unknown 34886 1727204485.58709: variable 'ansible_shell_executable' from source: unknown 34886 1727204485.58712: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.58813: variable 'ansible_pipelining' from source: unknown 34886 1727204485.58816: variable 'ansible_timeout' from source: unknown 34886 1727204485.58821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.58924: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204485.58943: variable 'omit' from source: magic vars 34886 1727204485.58952: starting attempt loop 34886 1727204485.58955: running the handler 34886 1727204485.58973: handler run complete 34886 1727204485.58988: attempt loop complete, returning result 34886 1727204485.59010: _execute() done 34886 1727204485.59012: 
dumping result to json 34886 1727204485.59015: done dumping result, returning 34886 1727204485.59018: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [12b410aa-8751-04b9-2e74-000000000007] 34886 1727204485.59022: sending task result for task 12b410aa-8751-04b9-2e74-000000000007 34886 1727204485.59121: done sending task result for task 12b410aa-8751-04b9-2e74-000000000007 34886 1727204485.59125: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 34886 1727204485.59211: no more pending results, returning what we have 34886 1727204485.59214: results queue empty 34886 1727204485.59215: checking for any_errors_fatal 34886 1727204485.59224: done checking for any_errors_fatal 34886 1727204485.59225: checking for max_fail_percentage 34886 1727204485.59226: done checking for max_fail_percentage 34886 1727204485.59227: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.59228: done checking to see if all hosts have failed 34886 1727204485.59229: getting the remaining hosts for this loop 34886 1727204485.59230: done getting the remaining hosts for this loop 34886 1727204485.59234: getting the next task for host managed-node3 34886 1727204485.59241: done getting next task for host managed-node3 34886 1727204485.59243: ^ task is: TASK: meta (flush_handlers) 34886 1727204485.59245: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204485.59249: getting variables 34886 1727204485.59255: in VariableManager get_vars() 34886 1727204485.59279: Calling all_inventory to load vars for managed-node3 34886 1727204485.59281: Calling groups_inventory to load vars for managed-node3 34886 1727204485.59283: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.59292: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.59295: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.59297: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.59443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.59727: done with get_vars() 34886 1727204485.59738: done getting variables 34886 1727204485.59808: in VariableManager get_vars() 34886 1727204485.59818: Calling all_inventory to load vars for managed-node3 34886 1727204485.59824: Calling groups_inventory to load vars for managed-node3 34886 1727204485.59827: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.59832: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.59835: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.59839: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.60057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.60367: done with get_vars() 34886 1727204485.60383: done queuing things up, now waiting for results queue to drain 34886 1727204485.60385: results queue empty 34886 1727204485.60386: checking for any_errors_fatal 34886 1727204485.60388: done checking for any_errors_fatal 34886 1727204485.60391: checking for 
max_fail_percentage 34886 1727204485.60392: done checking for max_fail_percentage 34886 1727204485.60393: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.60394: done checking to see if all hosts have failed 34886 1727204485.60395: getting the remaining hosts for this loop 34886 1727204485.60396: done getting the remaining hosts for this loop 34886 1727204485.60399: getting the next task for host managed-node3 34886 1727204485.60403: done getting next task for host managed-node3 34886 1727204485.60405: ^ task is: TASK: meta (flush_handlers) 34886 1727204485.60407: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204485.60414: getting variables 34886 1727204485.60415: in VariableManager get_vars() 34886 1727204485.60427: Calling all_inventory to load vars for managed-node3 34886 1727204485.60430: Calling groups_inventory to load vars for managed-node3 34886 1727204485.60433: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.60440: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.60443: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.60451: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.60640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.61031: done with get_vars() 34886 1727204485.61042: done getting variables 34886 1727204485.61097: in VariableManager get_vars() 34886 1727204485.61107: Calling all_inventory to load vars for managed-node3 34886 1727204485.61110: Calling groups_inventory to load vars for managed-node3 34886 1727204485.61113: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.61118: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.61124: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.61128: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.61534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.62258: done with get_vars() 34886 1727204485.62272: done queuing things up, now waiting for results queue to drain 34886 1727204485.62274: results queue empty 34886 1727204485.62275: checking for any_errors_fatal 34886 1727204485.62276: done checking for any_errors_fatal 34886 1727204485.62277: checking for max_fail_percentage 34886 1727204485.62278: done checking for max_fail_percentage 34886 1727204485.62279: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.62280: done checking to see if all hosts have failed 34886 1727204485.62281: getting the remaining hosts for this loop 34886 1727204485.62282: done getting the remaining hosts for this loop 34886 1727204485.62284: getting the next task for host managed-node3 34886 1727204485.62288: done getting next task for host managed-node3 34886 1727204485.62291: ^ task is: None 34886 1727204485.62293: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 34886 1727204485.62294: done queuing things up, now waiting for results queue to drain 34886 1727204485.62295: results queue empty 34886 1727204485.62296: checking for any_errors_fatal 34886 1727204485.62297: done checking for any_errors_fatal 34886 1727204485.62298: checking for max_fail_percentage 34886 1727204485.62299: done checking for max_fail_percentage 34886 1727204485.62300: checking to see if all hosts have failed and the running result is not ok 34886 1727204485.62301: done checking to see if all hosts have failed 34886 1727204485.62303: getting the next task for host managed-node3 34886 1727204485.62305: done getting next task for host managed-node3 34886 1727204485.62306: ^ task is: None 34886 1727204485.62308: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204485.62409: in VariableManager get_vars() 34886 1727204485.62462: done with get_vars() 34886 1727204485.62471: in VariableManager get_vars() 34886 1727204485.62492: done with get_vars() 34886 1727204485.62498: variable 'omit' from source: magic vars 34886 1727204485.62538: in VariableManager get_vars() 34886 1727204485.62558: done with get_vars() 34886 1727204485.62599: variable 'omit' from source: magic vars PLAY [Play for testing IPv6 config] ******************************************** 34886 1727204485.63124: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 34886 1727204485.63148: getting the remaining hosts for this loop 34886 1727204485.63150: done getting the remaining hosts for this loop 34886 1727204485.63153: getting the next task for host managed-node3 34886 1727204485.63156: done getting next task for host managed-node3 34886 1727204485.63158: ^ task is: TASK: Gathering Facts 34886 1727204485.63159: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204485.63161: getting variables 34886 1727204485.63161: in VariableManager get_vars() 34886 1727204485.63173: Calling all_inventory to load vars for managed-node3 34886 1727204485.63175: Calling groups_inventory to load vars for managed-node3 34886 1727204485.63176: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204485.63180: Calling all_plugins_play to load vars for managed-node3 34886 1727204485.63194: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204485.63197: Calling groups_plugins_play to load vars for managed-node3 34886 1727204485.63435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204485.63734: done with get_vars() 34886 1727204485.63743: done getting variables 34886 1727204485.63792: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.068) 0:00:03.805 ***** 34886 1727204485.63821: entering _queue_task() for managed-node3/gather_facts 34886 1727204485.64124: worker is 1 (out of 1 available) 34886 1727204485.64136: exiting _queue_task() for managed-node3/gather_facts 34886 1727204485.64149: done queuing things up, now waiting for results queue to drain 34886 1727204485.64151: waiting for pending results... 
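
The trace above queues the implicit setup module (TASK [Gathering Facts]) for managed-node3 and hands it to a single worker. As a minimal sketch only, the same fact collection can be reproduced outside the playbook with an ad-hoc `ansible -m setup` call; the inventory path below is a placeholder and the output parsing is an assumption of this sketch, not something taken from the run logged here.

#!/usr/bin/env python3
# Sketch: re-run the fact gathering that TASK [Gathering Facts] performs, ad hoc.
import json
import subprocess

cmd = [
    "ansible", "managed-node3",   # same host the play targets in this log
    "-i", "inventory.yml",        # placeholder inventory path, not the one from this run
    "-m", "setup",                # the module behind TASK [Gathering Facts]
]
out = subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

# Ad-hoc output looks like "managed-node3 | SUCCESS => { ... }"; keep only the JSON part.
facts = json.loads(out[out.index("{"):])["ansible_facts"]
print(facts["ansible_distribution"], facts["ansible_distribution_major_version"])
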
34886 1727204485.64733: running TaskExecutor() for managed-node3/TASK: Gathering Facts 34886 1727204485.64739: in run() - task 12b410aa-8751-04b9-2e74-000000000115 34886 1727204485.64757: variable 'ansible_search_path' from source: unknown 34886 1727204485.64822: calling self._execute() 34886 1727204485.64935: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.64950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.64971: variable 'omit' from source: magic vars 34886 1727204485.65442: variable 'ansible_distribution_major_version' from source: facts 34886 1727204485.65470: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204485.65479: variable 'omit' from source: magic vars 34886 1727204485.65520: variable 'omit' from source: magic vars 34886 1727204485.65558: variable 'omit' from source: magic vars 34886 1727204485.65598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204485.65634: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204485.65652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204485.65669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204485.65688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204485.65737: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204485.65740: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.65743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.65825: Set connection var ansible_timeout to 10 34886 1727204485.65830: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204485.65833: Set connection var ansible_connection to ssh 34886 1727204485.65844: Set connection var ansible_shell_executable to /bin/sh 34886 1727204485.65851: Set connection var ansible_pipelining to False 34886 1727204485.65855: Set connection var ansible_shell_type to sh 34886 1727204485.65877: variable 'ansible_shell_executable' from source: unknown 34886 1727204485.65880: variable 'ansible_connection' from source: unknown 34886 1727204485.65883: variable 'ansible_module_compression' from source: unknown 34886 1727204485.65887: variable 'ansible_shell_type' from source: unknown 34886 1727204485.65893: variable 'ansible_shell_executable' from source: unknown 34886 1727204485.65896: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204485.65908: variable 'ansible_pipelining' from source: unknown 34886 1727204485.65911: variable 'ansible_timeout' from source: unknown 34886 1727204485.65914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204485.66068: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204485.66080: variable 'omit' from source: magic vars 34886 1727204485.66086: starting attempt loop 34886 1727204485.66091: running the 
handler 34886 1727204485.66106: variable 'ansible_facts' from source: unknown 34886 1727204485.66130: _low_level_execute_command(): starting 34886 1727204485.66133: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204485.66692: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204485.66696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204485.66722: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204485.66795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204485.66815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204485.66902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204485.69269: stdout chunk (state=3): >>>/root <<< 34886 1727204485.69430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204485.69488: stderr chunk (state=3): >>><<< 34886 1727204485.69494: stdout chunk (state=3): >>><<< 34886 1727204485.69518: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204485.69536: _low_level_execute_command(): starting 34886 1727204485.69542: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955 `" && echo ansible-tmp-1727204485.6951947-35336-55389128165955="` echo 
/root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955 `" ) && sleep 0' 34886 1727204485.70318: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204485.70340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204485.70353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204485.70367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204485.70444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204485.73331: stdout chunk (state=3): >>>ansible-tmp-1727204485.6951947-35336-55389128165955=/root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955 <<< 34886 1727204485.73529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204485.73629: stderr chunk (state=3): >>><<< 34886 1727204485.73632: stdout chunk (state=3): >>><<< 34886 1727204485.73647: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204485.6951947-35336-55389128165955=/root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204485.73697: variable 'ansible_module_compression' from source: unknown 34886 1727204485.73760: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34886 1727204485.73899: variable 'ansible_facts' from source: unknown 34886 1727204485.74013: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955/AnsiballZ_setup.py 34886 1727204485.74208: Sending initial data 34886 1727204485.74213: Sent initial data (153 bytes) 34886 1727204485.74742: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204485.74745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204485.74748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204485.74750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204485.74759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204485.74810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204485.74831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204485.74881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204485.77176: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 34886 1727204485.77185: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204485.77214: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204485.77254: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp_q6fk3bd /root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955/AnsiballZ_setup.py <<< 34886 1727204485.77267: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955/AnsiballZ_setup.py" <<< 34886 1727204485.77294: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp_q6fk3bd" to remote "/root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955/AnsiballZ_setup.py" <<< 34886 1727204485.79275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204485.79374: stderr chunk (state=3): >>><<< 34886 1727204485.79377: stdout chunk (state=3): >>><<< 34886 1727204485.79397: done transferring module to remote 34886 1727204485.79412: _low_level_execute_command(): starting 34886 1727204485.79415: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955/ /root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955/AnsiballZ_setup.py && sleep 0' 34886 1727204485.80142: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204485.80146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204485.80166: stderr chunk (state=3): >>>debug2: match found <<< 34886 1727204485.80169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204485.80238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204485.80241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204485.80284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204485.82980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204485.83028: stderr chunk (state=3): >>><<< 34886 1727204485.83052: stdout chunk (state=3): >>><<< 34886 1727204485.83072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204485.83076: _low_level_execute_command(): starting 34886 1727204485.83083: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955/AnsiballZ_setup.py && sleep 0' 34886 1727204485.83663: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204485.83674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204485.83680: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204485.83683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204485.83748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204485.83798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204486.75363: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_distribution": 
"Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "26", "epoch": "1727204486", "epoch_int": "1727204486", "date": "2024-09-24", "time": "15:01:26", "iso8601_micro": "2024-09-24T19:01:26.240979Z", "iso8601": "2024-09-24T19:01:26Z", "iso8601_basic": "20240924T150126240979", "iso8601_basic_short": "20240924T150126", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.72802734375, "5m": 0.5869140625, "15m": 0.3798828125}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": 
{"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off 
[fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2856, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 861, "free": 2856}, "nocache": {"free": 3476, "used": 241}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": 
"524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 990, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251152879616, "block_size": 4096, "block_total": 64479564, "block_available": 61316621, "block_used": 3162943, "inode_total": 16384000, "inode_available": 16302322, "inode_used": 81678, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34886 1727204486.78065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204486.78124: stderr chunk (state=3): >>><<< 34886 1727204486.78129: stdout chunk (state=3): >>><<< 34886 1727204486.78160: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "01", "second": "26", "epoch": "1727204486", "epoch_int": "1727204486", "date": "2024-09-24", "time": "15:01:26", "iso8601_micro": "2024-09-24T19:01:26.240979Z", "iso8601": "2024-09-24T19:01:26Z", "iso8601_basic": "20240924T150126240979", "iso8601_basic_short": "20240924T150126", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.72802734375, "5m": 0.5869140625, "15m": 0.3798828125}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, 
"ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2856, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 861, "free": 2856}, "nocache": {"free": 3476, "used": 241}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 990, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251152879616, "block_size": 4096, "block_total": 64479564, "block_available": 61316621, "block_used": 3162943, "inode_total": 16384000, "inode_available": 16302322, "inode_used": 81678, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": 
"UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
34886 1727204486.78685: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204486.78800: _low_level_execute_command(): starting 34886 1727204486.78804: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204485.6951947-35336-55389128165955/ > /dev/null 2>&1 && sleep 0' 34886 1727204486.79361: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204486.79365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204486.79367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204486.79430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204486.79451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204486.79514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204486.82026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204486.82071: stderr chunk (state=3): >>><<< 34886 1727204486.82074: stdout chunk (state=3): >>><<< 34886 1727204486.82097: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204486.82105: handler run complete 34886 1727204486.82235: variable 'ansible_facts' from source: unknown 34886 1727204486.82322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204486.82716: variable 'ansible_facts' from source: unknown 34886 1727204486.82894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204486.83056: attempt loop complete, returning result 34886 1727204486.83068: _execute() done 34886 1727204486.83075: dumping result to json 34886 1727204486.83125: done dumping result, returning 34886 1727204486.83141: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-04b9-2e74-000000000115] 34886 1727204486.83153: sending task result for task 12b410aa-8751-04b9-2e74-000000000115 ok: [managed-node3] 34886 1727204486.84051: no more pending results, returning what we have 34886 1727204486.84054: results queue empty 34886 1727204486.84056: checking for any_errors_fatal 34886 1727204486.84057: done checking for any_errors_fatal 34886 1727204486.84058: checking for max_fail_percentage 34886 1727204486.84060: done checking for max_fail_percentage 34886 1727204486.84061: checking to see if all hosts have failed and the running result is not ok 34886 1727204486.84061: done checking to see if all hosts have failed 34886 1727204486.84062: getting the remaining hosts for this loop 34886 1727204486.84064: done getting the remaining hosts for this loop 34886 1727204486.84067: getting the next task for host managed-node3 34886 1727204486.84073: done getting next task for host managed-node3 34886 1727204486.84075: ^ task is: TASK: meta (flush_handlers) 34886 1727204486.84078: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204486.84081: getting variables 34886 1727204486.84083: in VariableManager get_vars() 34886 1727204486.84132: Calling all_inventory to load vars for managed-node3 34886 1727204486.84136: Calling groups_inventory to load vars for managed-node3 34886 1727204486.84140: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204486.84153: Calling all_plugins_play to load vars for managed-node3 34886 1727204486.84157: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204486.84162: Calling groups_plugins_play to load vars for managed-node3 34886 1727204486.84423: done sending task result for task 12b410aa-8751-04b9-2e74-000000000115 34886 1727204486.84427: WORKER PROCESS EXITING 34886 1727204486.84444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204486.84840: done with get_vars() 34886 1727204486.84855: done getting variables 34886 1727204486.84930: in VariableManager get_vars() 34886 1727204486.84942: Calling all_inventory to load vars for managed-node3 34886 1727204486.84944: Calling groups_inventory to load vars for managed-node3 34886 1727204486.84946: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204486.84949: Calling all_plugins_play to load vars for managed-node3 34886 1727204486.84951: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204486.84953: Calling groups_plugins_play to load vars for managed-node3 34886 1727204486.85087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204486.85270: done with get_vars() 34886 1727204486.85282: done queuing things up, now waiting for results queue to drain 34886 1727204486.85284: results queue empty 34886 1727204486.85284: checking for any_errors_fatal 34886 1727204486.85287: done checking for any_errors_fatal 34886 1727204486.85288: checking for max_fail_percentage 34886 1727204486.85288: done checking for max_fail_percentage 34886 1727204486.85291: checking to see if all hosts have failed and the running result is not ok 34886 1727204486.85292: done checking to see if all hosts have failed 34886 1727204486.85295: getting the remaining hosts for this loop 34886 1727204486.85296: done getting the remaining hosts for this loop 34886 1727204486.85298: getting the next task for host managed-node3 34886 1727204486.85301: done getting next task for host managed-node3 34886 1727204486.85303: ^ task is: TASK: Include the task 'show_interfaces.yml' 34886 1727204486.85304: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204486.85306: getting variables 34886 1727204486.85307: in VariableManager get_vars() 34886 1727204486.85322: Calling all_inventory to load vars for managed-node3 34886 1727204486.85325: Calling groups_inventory to load vars for managed-node3 34886 1727204486.85327: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204486.85331: Calling all_plugins_play to load vars for managed-node3 34886 1727204486.85333: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204486.85335: Calling groups_plugins_play to load vars for managed-node3 34886 1727204486.85458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204486.85637: done with get_vars() 34886 1727204486.85645: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:9 Tuesday 24 September 2024 15:01:26 -0400 (0:00:01.218) 0:00:05.024 ***** 34886 1727204486.85703: entering _queue_task() for managed-node3/include_tasks 34886 1727204486.85939: worker is 1 (out of 1 available) 34886 1727204486.85954: exiting _queue_task() for managed-node3/include_tasks 34886 1727204486.85967: done queuing things up, now waiting for results queue to drain 34886 1727204486.85969: waiting for pending results... 34886 1727204486.86130: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 34886 1727204486.86202: in run() - task 12b410aa-8751-04b9-2e74-00000000000b 34886 1727204486.86213: variable 'ansible_search_path' from source: unknown 34886 1727204486.86246: calling self._execute() 34886 1727204486.86323: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204486.86328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204486.86334: variable 'omit' from source: magic vars 34886 1727204486.86708: variable 'ansible_distribution_major_version' from source: facts 34886 1727204486.86722: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204486.86726: _execute() done 34886 1727204486.86729: dumping result to json 34886 1727204486.86732: done dumping result, returning 34886 1727204486.86741: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-04b9-2e74-00000000000b] 34886 1727204486.86752: sending task result for task 12b410aa-8751-04b9-2e74-00000000000b 34886 1727204486.86844: done sending task result for task 12b410aa-8751-04b9-2e74-00000000000b 34886 1727204486.86847: WORKER PROCESS EXITING 34886 1727204486.86884: no more pending results, returning what we have 34886 1727204486.86890: in VariableManager get_vars() 34886 1727204486.86935: Calling all_inventory to load vars for managed-node3 34886 1727204486.86938: Calling groups_inventory to load vars for managed-node3 34886 1727204486.86941: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204486.86951: Calling all_plugins_play to load vars for managed-node3 34886 1727204486.86955: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204486.86958: Calling groups_plugins_play to load vars for managed-node3 34886 1727204486.87181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204486.87537: done with get_vars() 34886 1727204486.87543: variable 
'ansible_search_path' from source: unknown 34886 1727204486.87560: we have included files to process 34886 1727204486.87562: generating all_blocks data 34886 1727204486.87563: done generating all_blocks data 34886 1727204486.87565: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34886 1727204486.87566: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34886 1727204486.87574: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34886 1727204486.87809: in VariableManager get_vars() 34886 1727204486.87828: done with get_vars() 34886 1727204486.88036: done processing included file 34886 1727204486.88038: iterating over new_blocks loaded from include file 34886 1727204486.88040: in VariableManager get_vars() 34886 1727204486.88060: done with get_vars() 34886 1727204486.88062: filtering new block on tags 34886 1727204486.88075: done filtering new block on tags 34886 1727204486.88077: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 34886 1727204486.88093: extending task lists for all hosts with included blocks 34886 1727204486.88200: done extending task lists 34886 1727204486.88201: done processing included files 34886 1727204486.88202: results queue empty 34886 1727204486.88203: checking for any_errors_fatal 34886 1727204486.88204: done checking for any_errors_fatal 34886 1727204486.88204: checking for max_fail_percentage 34886 1727204486.88205: done checking for max_fail_percentage 34886 1727204486.88206: checking to see if all hosts have failed and the running result is not ok 34886 1727204486.88206: done checking to see if all hosts have failed 34886 1727204486.88207: getting the remaining hosts for this loop 34886 1727204486.88208: done getting the remaining hosts for this loop 34886 1727204486.88210: getting the next task for host managed-node3 34886 1727204486.88212: done getting next task for host managed-node3 34886 1727204486.88214: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 34886 1727204486.88216: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204486.88217: getting variables 34886 1727204486.88218: in VariableManager get_vars() 34886 1727204486.88230: Calling all_inventory to load vars for managed-node3 34886 1727204486.88232: Calling groups_inventory to load vars for managed-node3 34886 1727204486.88234: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204486.88237: Calling all_plugins_play to load vars for managed-node3 34886 1727204486.88239: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204486.88241: Calling groups_plugins_play to load vars for managed-node3 34886 1727204486.88386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204486.88569: done with get_vars() 34886 1727204486.88590: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:01:26 -0400 (0:00:00.029) 0:00:05.054 ***** 34886 1727204486.88696: entering _queue_task() for managed-node3/include_tasks 34886 1727204486.89049: worker is 1 (out of 1 available) 34886 1727204486.89069: exiting _queue_task() for managed-node3/include_tasks 34886 1727204486.89091: done queuing things up, now waiting for results queue to drain 34886 1727204486.89093: waiting for pending results... 34886 1727204486.89394: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 34886 1727204486.89562: in run() - task 12b410aa-8751-04b9-2e74-00000000012b 34886 1727204486.89601: variable 'ansible_search_path' from source: unknown 34886 1727204486.89604: variable 'ansible_search_path' from source: unknown 34886 1727204486.89684: calling self._execute() 34886 1727204486.89817: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204486.89838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204486.89896: variable 'omit' from source: magic vars 34886 1727204486.90433: variable 'ansible_distribution_major_version' from source: facts 34886 1727204486.90437: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204486.90440: _execute() done 34886 1727204486.90442: dumping result to json 34886 1727204486.90448: done dumping result, returning 34886 1727204486.90451: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-04b9-2e74-00000000012b] 34886 1727204486.90454: sending task result for task 12b410aa-8751-04b9-2e74-00000000012b 34886 1727204486.90644: no more pending results, returning what we have 34886 1727204486.90652: in VariableManager get_vars() 34886 1727204486.90704: Calling all_inventory to load vars for managed-node3 34886 1727204486.90709: Calling groups_inventory to load vars for managed-node3 34886 1727204486.90716: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204486.90731: Calling all_plugins_play to load vars for managed-node3 34886 1727204486.90735: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204486.90745: Calling groups_plugins_play to load vars for managed-node3 34886 1727204486.90988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204486.91267: done with get_vars() 34886 1727204486.91276: variable 'ansible_search_path' from source: 
unknown 34886 1727204486.91277: variable 'ansible_search_path' from source: unknown 34886 1727204486.91296: done sending task result for task 12b410aa-8751-04b9-2e74-00000000012b 34886 1727204486.91300: WORKER PROCESS EXITING 34886 1727204486.91330: we have included files to process 34886 1727204486.91331: generating all_blocks data 34886 1727204486.91335: done generating all_blocks data 34886 1727204486.91336: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34886 1727204486.91338: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34886 1727204486.91342: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34886 1727204486.91661: done processing included file 34886 1727204486.91663: iterating over new_blocks loaded from include file 34886 1727204486.91664: in VariableManager get_vars() 34886 1727204486.91678: done with get_vars() 34886 1727204486.91679: filtering new block on tags 34886 1727204486.91694: done filtering new block on tags 34886 1727204486.91696: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 34886 1727204486.91699: extending task lists for all hosts with included blocks 34886 1727204486.91781: done extending task lists 34886 1727204486.91782: done processing included files 34886 1727204486.91783: results queue empty 34886 1727204486.91784: checking for any_errors_fatal 34886 1727204486.91788: done checking for any_errors_fatal 34886 1727204486.91790: checking for max_fail_percentage 34886 1727204486.91792: done checking for max_fail_percentage 34886 1727204486.91793: checking to see if all hosts have failed and the running result is not ok 34886 1727204486.91794: done checking to see if all hosts have failed 34886 1727204486.91795: getting the remaining hosts for this loop 34886 1727204486.91796: done getting the remaining hosts for this loop 34886 1727204486.91801: getting the next task for host managed-node3 34886 1727204486.91807: done getting next task for host managed-node3 34886 1727204486.91810: ^ task is: TASK: Gather current interface info 34886 1727204486.91814: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204486.91816: getting variables 34886 1727204486.91818: in VariableManager get_vars() 34886 1727204486.91833: Calling all_inventory to load vars for managed-node3 34886 1727204486.91836: Calling groups_inventory to load vars for managed-node3 34886 1727204486.91838: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204486.91843: Calling all_plugins_play to load vars for managed-node3 34886 1727204486.91845: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204486.91847: Calling groups_plugins_play to load vars for managed-node3 34886 1727204486.92016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204486.92233: done with get_vars() 34886 1727204486.92244: done getting variables 34886 1727204486.92279: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:01:26 -0400 (0:00:00.036) 0:00:05.090 ***** 34886 1727204486.92307: entering _queue_task() for managed-node3/command 34886 1727204486.92537: worker is 1 (out of 1 available) 34886 1727204486.92554: exiting _queue_task() for managed-node3/command 34886 1727204486.92568: done queuing things up, now waiting for results queue to drain 34886 1727204486.92570: waiting for pending results... 
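The trace above has just expanded tests_ipv6.yml:9 into tasks/show_interfaces.yml, which in turn pulls in tasks/get_current_interfaces.yml at show_interfaces.yml:3. A minimal sketch of that include step, inferred only from the task names and paths logged here and not taken from the file itself:

- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml   # path relative to show_interfaces.yml; exact form is an assumption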
34886 1727204486.92773: running TaskExecutor() for managed-node3/TASK: Gather current interface info 34886 1727204486.92856: in run() - task 12b410aa-8751-04b9-2e74-00000000013a 34886 1727204486.92868: variable 'ansible_search_path' from source: unknown 34886 1727204486.92871: variable 'ansible_search_path' from source: unknown 34886 1727204486.92909: calling self._execute() 34886 1727204486.92981: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204486.92987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204486.92999: variable 'omit' from source: magic vars 34886 1727204486.93314: variable 'ansible_distribution_major_version' from source: facts 34886 1727204486.93327: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204486.93334: variable 'omit' from source: magic vars 34886 1727204486.93376: variable 'omit' from source: magic vars 34886 1727204486.93406: variable 'omit' from source: magic vars 34886 1727204486.93443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204486.93478: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204486.93499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204486.93515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204486.93529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204486.93558: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204486.93562: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204486.93566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204486.93653: Set connection var ansible_timeout to 10 34886 1727204486.93659: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204486.93662: Set connection var ansible_connection to ssh 34886 1727204486.93669: Set connection var ansible_shell_executable to /bin/sh 34886 1727204486.93683: Set connection var ansible_pipelining to False 34886 1727204486.93687: Set connection var ansible_shell_type to sh 34886 1727204486.93708: variable 'ansible_shell_executable' from source: unknown 34886 1727204486.93711: variable 'ansible_connection' from source: unknown 34886 1727204486.93714: variable 'ansible_module_compression' from source: unknown 34886 1727204486.93719: variable 'ansible_shell_type' from source: unknown 34886 1727204486.93725: variable 'ansible_shell_executable' from source: unknown 34886 1727204486.93729: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204486.93734: variable 'ansible_pipelining' from source: unknown 34886 1727204486.93737: variable 'ansible_timeout' from source: unknown 34886 1727204486.93742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204486.93861: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204486.93871: variable 'omit' from source: magic vars 34886 
1727204486.93878: starting attempt loop 34886 1727204486.93881: running the handler 34886 1727204486.93901: _low_level_execute_command(): starting 34886 1727204486.93912: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204486.94486: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204486.94493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204486.94496: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204486.94499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204486.94543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204486.94546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204486.94609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204486.97071: stdout chunk (state=3): >>>/root <<< 34886 1727204486.97235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204486.97313: stderr chunk (state=3): >>><<< 34886 1727204486.97316: stdout chunk (state=3): >>><<< 34886 1727204486.97404: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204486.97408: _low_level_execute_command(): starting 34886 1727204486.97410: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379 `" && echo ansible-tmp-1727204486.973573-35379-193071691643379="` 
echo /root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379 `" ) && sleep 0' 34886 1727204486.97999: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204486.98003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204486.98005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204486.98018: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204486.98041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204486.98073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204486.98076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204486.98175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204487.00949: stdout chunk (state=3): >>>ansible-tmp-1727204486.973573-35379-193071691643379=/root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379 <<< 34886 1727204487.01118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204487.01172: stderr chunk (state=3): >>><<< 34886 1727204487.01175: stdout chunk (state=3): >>><<< 34886 1727204487.01197: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204486.973573-35379-193071691643379=/root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204487.01231: variable 'ansible_module_compression' from source: unknown 34886 1727204487.01275: ANSIBALLZ: Using generic lock for ansible.legacy.command 34886 1727204487.01278: ANSIBALLZ: Acquiring 
lock 34886 1727204487.01281: ANSIBALLZ: Lock acquired: 139734986903328 34886 1727204487.01287: ANSIBALLZ: Creating module 34886 1727204487.13801: ANSIBALLZ: Writing module into payload 34886 1727204487.13887: ANSIBALLZ: Writing module 34886 1727204487.13906: ANSIBALLZ: Renaming module 34886 1727204487.13912: ANSIBALLZ: Done creating module 34886 1727204487.13928: variable 'ansible_facts' from source: unknown 34886 1727204487.13982: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379/AnsiballZ_command.py 34886 1727204487.14116: Sending initial data 34886 1727204487.14123: Sent initial data (155 bytes) 34886 1727204487.14614: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204487.14617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204487.14623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204487.14626: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204487.14628: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204487.14683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204487.14688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204487.14736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204487.17068: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204487.17112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204487.17155: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp6g88xi1k /root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379/AnsiballZ_command.py <<< 34886 1727204487.17158: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379/AnsiballZ_command.py" <<< 34886 1727204487.17210: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp6g88xi1k" to remote "/root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379/AnsiballZ_command.py" <<< 34886 1727204487.18079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204487.18186: stderr chunk (state=3): >>><<< 34886 1727204487.18192: stdout chunk (state=3): >>><<< 34886 1727204487.18246: done transferring module to remote 34886 1727204487.18249: _low_level_execute_command(): starting 34886 1727204487.18252: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379/ /root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379/AnsiballZ_command.py && sleep 0' 34886 1727204487.18862: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204487.18866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204487.18870: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204487.18873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204487.18949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204487.18957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204487.18987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204487.21640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204487.21730: stderr chunk (state=3): >>><<< 34886 1727204487.21733: stdout chunk (state=3): >>><<< 34886 1727204487.21750: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204487.21754: _low_level_execute_command(): starting 34886 1727204487.21760: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379/AnsiballZ_command.py && sleep 0' 34886 1727204487.22245: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204487.22250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204487.22253: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204487.22255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204487.22324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204487.22345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204487.22444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204487.50266: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:01:27.495929", "end": "2024-09-24 15:01:27.501027", "delta": "0:00:00.005098", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 34886 1727204487.50298: stdout chunk (state=3): >>> <<< 34886 1727204487.52749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204487.52787: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204487.52792: stdout chunk (state=3): >>><<< 34886 1727204487.52795: stderr chunk (state=3): >>><<< 34886 1727204487.52814: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:01:27.495929", "end": "2024-09-24 15:01:27.501027", "delta": "0:00:00.005098", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
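Based on the module arguments and output captured above (chdir /sys/class/net, ls -1, stdout bonding_masters / eth0 / lo), the task at get_current_interfaces.yml:3 presumably resembles the sketch below. The register name _current_interfaces is inferred from the later 'Set current_interfaces' step, and changed_when: false is an assumption that would explain why the displayed result just below reports changed: false even though the raw module return above says changed: true.

- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces   # name inferred from the later set_fact step
  changed_when: false             # assumption; something overrides the module's "changed": true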
34886 1727204487.52867: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204487.52900: _low_level_execute_command(): starting 34886 1727204487.52987: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204486.973573-35379-193071691643379/ > /dev/null 2>&1 && sleep 0' 34886 1727204487.53585: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204487.53628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204487.53632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204487.53635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204487.53638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204487.53640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204487.53646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204487.53658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204487.53726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204487.53753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204487.53810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204487.56699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204487.56703: stdout chunk (state=3): >>><<< 34886 1727204487.56706: stderr chunk (state=3): >>><<< 34886 1727204487.56710: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204487.56713: handler run complete 34886 1727204487.56715: Evaluated conditional (False): False 34886 1727204487.56794: attempt loop complete, returning result 34886 1727204487.56798: _execute() done 34886 1727204487.56802: dumping result to json 34886 1727204487.56804: done dumping result, returning 34886 1727204487.56807: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [12b410aa-8751-04b9-2e74-00000000013a] 34886 1727204487.56809: sending task result for task 12b410aa-8751-04b9-2e74-00000000013a ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.005098", "end": "2024-09-24 15:01:27.501027", "rc": 0, "start": "2024-09-24 15:01:27.495929" } STDOUT: bonding_masters eth0 lo 34886 1727204487.57038: no more pending results, returning what we have 34886 1727204487.57042: results queue empty 34886 1727204487.57043: checking for any_errors_fatal 34886 1727204487.57045: done checking for any_errors_fatal 34886 1727204487.57046: checking for max_fail_percentage 34886 1727204487.57049: done checking for max_fail_percentage 34886 1727204487.57050: checking to see if all hosts have failed and the running result is not ok 34886 1727204487.57051: done checking to see if all hosts have failed 34886 1727204487.57052: getting the remaining hosts for this loop 34886 1727204487.57054: done getting the remaining hosts for this loop 34886 1727204487.57059: getting the next task for host managed-node3 34886 1727204487.57067: done getting next task for host managed-node3 34886 1727204487.57070: ^ task is: TASK: Set current_interfaces 34886 1727204487.57075: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204487.57080: getting variables 34886 1727204487.57082: in VariableManager get_vars() 34886 1727204487.57334: Calling all_inventory to load vars for managed-node3 34886 1727204487.57375: Calling groups_inventory to load vars for managed-node3 34886 1727204487.57380: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204487.57395: done sending task result for task 12b410aa-8751-04b9-2e74-00000000013a 34886 1727204487.57398: WORKER PROCESS EXITING 34886 1727204487.57412: Calling all_plugins_play to load vars for managed-node3 34886 1727204487.57416: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204487.57428: Calling groups_plugins_play to load vars for managed-node3 34886 1727204487.57697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204487.57916: done with get_vars() 34886 1727204487.57927: done getting variables 34886 1727204487.57979: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.657) 0:00:05.747 ***** 34886 1727204487.58009: entering _queue_task() for managed-node3/set_fact 34886 1727204487.58227: worker is 1 (out of 1 available) 34886 1727204487.58240: exiting _queue_task() for managed-node3/set_fact 34886 1727204487.58255: done queuing things up, now waiting for results queue to drain 34886 1727204487.58257: waiting for pending results... 
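The next task, at get_current_interfaces.yml:9, turns the registered command output into the current_interfaces fact. A sketch consistent with the result shown below; the variable names are inferred from the log, not read from the file:

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # yields ['bonding_masters', 'eth0', 'lo'] here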
34886 1727204487.58419: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 34886 1727204487.58508: in run() - task 12b410aa-8751-04b9-2e74-00000000013b 34886 1727204487.58521: variable 'ansible_search_path' from source: unknown 34886 1727204487.58527: variable 'ansible_search_path' from source: unknown 34886 1727204487.58559: calling self._execute() 34886 1727204487.58634: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.58641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.58651: variable 'omit' from source: magic vars 34886 1727204487.58968: variable 'ansible_distribution_major_version' from source: facts 34886 1727204487.58980: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204487.58986: variable 'omit' from source: magic vars 34886 1727204487.59025: variable 'omit' from source: magic vars 34886 1727204487.59116: variable '_current_interfaces' from source: set_fact 34886 1727204487.59174: variable 'omit' from source: magic vars 34886 1727204487.59211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204487.59244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204487.59265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204487.59282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204487.59363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204487.59368: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204487.59372: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.59375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.59421: Set connection var ansible_timeout to 10 34886 1727204487.59430: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204487.59433: Set connection var ansible_connection to ssh 34886 1727204487.59440: Set connection var ansible_shell_executable to /bin/sh 34886 1727204487.59449: Set connection var ansible_pipelining to False 34886 1727204487.59452: Set connection var ansible_shell_type to sh 34886 1727204487.59474: variable 'ansible_shell_executable' from source: unknown 34886 1727204487.59477: variable 'ansible_connection' from source: unknown 34886 1727204487.59480: variable 'ansible_module_compression' from source: unknown 34886 1727204487.59486: variable 'ansible_shell_type' from source: unknown 34886 1727204487.59489: variable 'ansible_shell_executable' from source: unknown 34886 1727204487.59500: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.59503: variable 'ansible_pipelining' from source: unknown 34886 1727204487.59505: variable 'ansible_timeout' from source: unknown 34886 1727204487.59507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.59633: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 34886 1727204487.59643: variable 'omit' from source: magic vars 34886 1727204487.59650: starting attempt loop 34886 1727204487.59653: running the handler 34886 1727204487.59664: handler run complete 34886 1727204487.59674: attempt loop complete, returning result 34886 1727204487.59677: _execute() done 34886 1727204487.59679: dumping result to json 34886 1727204487.59687: done dumping result, returning 34886 1727204487.59696: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [12b410aa-8751-04b9-2e74-00000000013b] 34886 1727204487.59701: sending task result for task 12b410aa-8751-04b9-2e74-00000000013b 34886 1727204487.59787: done sending task result for task 12b410aa-8751-04b9-2e74-00000000013b 34886 1727204487.59793: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 34886 1727204487.59859: no more pending results, returning what we have 34886 1727204487.59863: results queue empty 34886 1727204487.59864: checking for any_errors_fatal 34886 1727204487.59872: done checking for any_errors_fatal 34886 1727204487.59873: checking for max_fail_percentage 34886 1727204487.59875: done checking for max_fail_percentage 34886 1727204487.59876: checking to see if all hosts have failed and the running result is not ok 34886 1727204487.59877: done checking to see if all hosts have failed 34886 1727204487.59878: getting the remaining hosts for this loop 34886 1727204487.59879: done getting the remaining hosts for this loop 34886 1727204487.59883: getting the next task for host managed-node3 34886 1727204487.59892: done getting next task for host managed-node3 34886 1727204487.59895: ^ task is: TASK: Show current_interfaces 34886 1727204487.59898: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204487.59902: getting variables 34886 1727204487.59904: in VariableManager get_vars() 34886 1727204487.59944: Calling all_inventory to load vars for managed-node3 34886 1727204487.59948: Calling groups_inventory to load vars for managed-node3 34886 1727204487.59951: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204487.59961: Calling all_plugins_play to load vars for managed-node3 34886 1727204487.59964: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204487.59968: Calling groups_plugins_play to load vars for managed-node3 34886 1727204487.60236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204487.60560: done with get_vars() 34886 1727204487.60571: done getting variables 34886 1727204487.60680: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.027) 0:00:05.774 ***** 34886 1727204487.60722: entering _queue_task() for managed-node3/debug 34886 1727204487.60725: Creating lock for debug 34886 1727204487.61008: worker is 1 (out of 1 available) 34886 1727204487.61023: exiting _queue_task() for managed-node3/debug 34886 1727204487.61036: done queuing things up, now waiting for results queue to drain 34886 1727204487.61038: waiting for pending results... 
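The 'Show current_interfaces' task at show_interfaces.yml:5 is a plain debug of the fact just set. A sketch matching the message printed below, with the exact wording inferred from the logged MSG rather than from the file:

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"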
34886 1727204487.61314: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 34886 1727204487.61369: in run() - task 12b410aa-8751-04b9-2e74-00000000012c 34886 1727204487.61409: variable 'ansible_search_path' from source: unknown 34886 1727204487.61422: variable 'ansible_search_path' from source: unknown 34886 1727204487.61558: calling self._execute() 34886 1727204487.61569: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.61576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.61594: variable 'omit' from source: magic vars 34886 1727204487.62194: variable 'ansible_distribution_major_version' from source: facts 34886 1727204487.62199: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204487.62203: variable 'omit' from source: magic vars 34886 1727204487.62206: variable 'omit' from source: magic vars 34886 1727204487.62222: variable 'current_interfaces' from source: set_fact 34886 1727204487.62263: variable 'omit' from source: magic vars 34886 1727204487.62316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204487.62370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204487.62404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204487.62432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204487.62456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204487.62501: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204487.62586: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.62591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.62661: Set connection var ansible_timeout to 10 34886 1727204487.62894: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204487.62897: Set connection var ansible_connection to ssh 34886 1727204487.62901: Set connection var ansible_shell_executable to /bin/sh 34886 1727204487.62904: Set connection var ansible_pipelining to False 34886 1727204487.62906: Set connection var ansible_shell_type to sh 34886 1727204487.62908: variable 'ansible_shell_executable' from source: unknown 34886 1727204487.62913: variable 'ansible_connection' from source: unknown 34886 1727204487.62915: variable 'ansible_module_compression' from source: unknown 34886 1727204487.62917: variable 'ansible_shell_type' from source: unknown 34886 1727204487.62919: variable 'ansible_shell_executable' from source: unknown 34886 1727204487.62921: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.62923: variable 'ansible_pipelining' from source: unknown 34886 1727204487.62925: variable 'ansible_timeout' from source: unknown 34886 1727204487.62927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.63011: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
34886 1727204487.63063: variable 'omit' from source: magic vars 34886 1727204487.63066: starting attempt loop 34886 1727204487.63068: running the handler 34886 1727204487.63198: handler run complete 34886 1727204487.63201: attempt loop complete, returning result 34886 1727204487.63204: _execute() done 34886 1727204487.63206: dumping result to json 34886 1727204487.63208: done dumping result, returning 34886 1727204487.63211: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [12b410aa-8751-04b9-2e74-00000000012c] 34886 1727204487.63215: sending task result for task 12b410aa-8751-04b9-2e74-00000000012c ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 34886 1727204487.63401: no more pending results, returning what we have 34886 1727204487.63405: results queue empty 34886 1727204487.63406: checking for any_errors_fatal 34886 1727204487.63411: done checking for any_errors_fatal 34886 1727204487.63412: checking for max_fail_percentage 34886 1727204487.63414: done checking for max_fail_percentage 34886 1727204487.63414: checking to see if all hosts have failed and the running result is not ok 34886 1727204487.63416: done checking to see if all hosts have failed 34886 1727204487.63416: getting the remaining hosts for this loop 34886 1727204487.63418: done getting the remaining hosts for this loop 34886 1727204487.63425: getting the next task for host managed-node3 34886 1727204487.63433: done getting next task for host managed-node3 34886 1727204487.63436: ^ task is: TASK: Include the task 'manage_test_interface.yml' 34886 1727204487.63438: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204487.63442: getting variables 34886 1727204487.63456: in VariableManager get_vars() 34886 1727204487.63508: Calling all_inventory to load vars for managed-node3 34886 1727204487.63512: Calling groups_inventory to load vars for managed-node3 34886 1727204487.63518: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204487.63531: Calling all_plugins_play to load vars for managed-node3 34886 1727204487.63536: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204487.63543: Calling groups_plugins_play to load vars for managed-node3 34886 1727204487.63839: done sending task result for task 12b410aa-8751-04b9-2e74-00000000012c 34886 1727204487.63844: WORKER PROCESS EXITING 34886 1727204487.63871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204487.64134: done with get_vars() 34886 1727204487.64146: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:11 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.035) 0:00:05.810 ***** 34886 1727204487.64236: entering _queue_task() for managed-node3/include_tasks 34886 1727204487.64448: worker is 1 (out of 1 available) 34886 1727204487.64465: exiting _queue_task() for managed-node3/include_tasks 34886 1727204487.64478: done queuing things up, now waiting for results queue to drain 34886 1727204487.64480: waiting for pending results... 
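The MSG printed for the 'Show current_interfaces' result above suggests the task at show_interfaces.yml:5 is a plain debug of the current_interfaces fact (which the executor notes comes from set_fact). A minimal sketch of such a task follows; the file itself is not reproduced in this log, so the exact wording of the msg template is an assumption.

# Sketch of the task at tasks/show_interfaces.yml:5 (assumed form)
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"

With current_interfaces set to ['bonding_masters', 'eth0', 'lo'], this renders exactly the MSG shown in the task result above.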
34886 1727204487.64729: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 34886 1727204487.64773: in run() - task 12b410aa-8751-04b9-2e74-00000000000c 34886 1727204487.64787: variable 'ansible_search_path' from source: unknown 34886 1727204487.64820: calling self._execute() 34886 1727204487.64947: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.64950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.64953: variable 'omit' from source: magic vars 34886 1727204487.65394: variable 'ansible_distribution_major_version' from source: facts 34886 1727204487.65397: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204487.65400: _execute() done 34886 1727204487.65403: dumping result to json 34886 1727204487.65406: done dumping result, returning 34886 1727204487.65424: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [12b410aa-8751-04b9-2e74-00000000000c] 34886 1727204487.65427: sending task result for task 12b410aa-8751-04b9-2e74-00000000000c 34886 1727204487.65525: done sending task result for task 12b410aa-8751-04b9-2e74-00000000000c 34886 1727204487.65528: WORKER PROCESS EXITING 34886 1727204487.65557: no more pending results, returning what we have 34886 1727204487.65562: in VariableManager get_vars() 34886 1727204487.65604: Calling all_inventory to load vars for managed-node3 34886 1727204487.65607: Calling groups_inventory to load vars for managed-node3 34886 1727204487.65609: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204487.65622: Calling all_plugins_play to load vars for managed-node3 34886 1727204487.65626: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204487.65630: Calling groups_plugins_play to load vars for managed-node3 34886 1727204487.65786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204487.65967: done with get_vars() 34886 1727204487.65974: variable 'ansible_search_path' from source: unknown 34886 1727204487.65984: we have included files to process 34886 1727204487.65984: generating all_blocks data 34886 1727204487.65986: done generating all_blocks data 34886 1727204487.65991: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34886 1727204487.65992: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34886 1727204487.65994: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34886 1727204487.66435: in VariableManager get_vars() 34886 1727204487.66452: done with get_vars() 34886 1727204487.66633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 34886 1727204487.67134: done processing included file 34886 1727204487.67135: iterating over new_blocks loaded from include file 34886 1727204487.67136: in VariableManager get_vars() 34886 1727204487.67152: done with get_vars() 34886 1727204487.67153: filtering new block on tags 34886 1727204487.67180: done filtering new block on tags 34886 1727204487.67182: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 34886 1727204487.67343: extending task lists for all hosts with included blocks 34886 1727204487.67471: done extending task lists 34886 1727204487.67473: done processing included files 34886 1727204487.67473: results queue empty 34886 1727204487.67474: checking for any_errors_fatal 34886 1727204487.67476: done checking for any_errors_fatal 34886 1727204487.67476: checking for max_fail_percentage 34886 1727204487.67477: done checking for max_fail_percentage 34886 1727204487.67478: checking to see if all hosts have failed and the running result is not ok 34886 1727204487.67478: done checking to see if all hosts have failed 34886 1727204487.67479: getting the remaining hosts for this loop 34886 1727204487.67480: done getting the remaining hosts for this loop 34886 1727204487.67482: getting the next task for host managed-node3 34886 1727204487.67484: done getting next task for host managed-node3 34886 1727204487.67486: ^ task is: TASK: Ensure state in ["present", "absent"] 34886 1727204487.67487: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204487.67491: getting variables 34886 1727204487.67492: in VariableManager get_vars() 34886 1727204487.67504: Calling all_inventory to load vars for managed-node3 34886 1727204487.67505: Calling groups_inventory to load vars for managed-node3 34886 1727204487.67507: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204487.67512: Calling all_plugins_play to load vars for managed-node3 34886 1727204487.67513: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204487.67516: Calling groups_plugins_play to load vars for managed-node3 34886 1727204487.67642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204487.67818: done with get_vars() 34886 1727204487.67829: done getting variables 34886 1727204487.67879: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.036) 0:00:05.846 ***** 34886 1727204487.67905: entering _queue_task() for managed-node3/fail 34886 1727204487.67907: Creating lock for fail 34886 1727204487.68149: worker is 1 (out of 1 available) 34886 1727204487.68163: exiting _queue_task() for managed-node3/fail 34886 1727204487.68179: done queuing things up, now waiting for results queue to drain 34886 1727204487.68181: waiting for pending results... 
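The include processed above (task path tests_ipv6.yml:11) resolves to include_tasks and is gated on ansible_distribution_major_version != '6'. A plausible sketch follows; note that the same conditional is evaluated for every task in this run, so it may actually be applied at a higher level (for example a play- or block-level when) rather than on the task itself, and the relative path is likewise an assumption.

# Sketch of the task at playbooks/tests_ipv6.yml:11 (assumed form)
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
  when: ansible_distribution_major_version != '6'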
34886 1727204487.68454: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 34886 1727204487.68550: in run() - task 12b410aa-8751-04b9-2e74-000000000156 34886 1727204487.68558: variable 'ansible_search_path' from source: unknown 34886 1727204487.68562: variable 'ansible_search_path' from source: unknown 34886 1727204487.68661: calling self._execute() 34886 1727204487.68700: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.68752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.68756: variable 'omit' from source: magic vars 34886 1727204487.69234: variable 'ansible_distribution_major_version' from source: facts 34886 1727204487.69238: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204487.69454: variable 'state' from source: include params 34886 1727204487.69459: Evaluated conditional (state not in ["present", "absent"]): False 34886 1727204487.69462: when evaluation is False, skipping this task 34886 1727204487.69474: _execute() done 34886 1727204487.69492: dumping result to json 34886 1727204487.69495: done dumping result, returning 34886 1727204487.69503: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [12b410aa-8751-04b9-2e74-000000000156] 34886 1727204487.69510: sending task result for task 12b410aa-8751-04b9-2e74-000000000156 34886 1727204487.69678: done sending task result for task 12b410aa-8751-04b9-2e74-000000000156 34886 1727204487.69729: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 34886 1727204487.69784: no more pending results, returning what we have 34886 1727204487.69787: results queue empty 34886 1727204487.69788: checking for any_errors_fatal 34886 1727204487.69792: done checking for any_errors_fatal 34886 1727204487.69795: checking for max_fail_percentage 34886 1727204487.69796: done checking for max_fail_percentage 34886 1727204487.69797: checking to see if all hosts have failed and the running result is not ok 34886 1727204487.69798: done checking to see if all hosts have failed 34886 1727204487.69799: getting the remaining hosts for this loop 34886 1727204487.69800: done getting the remaining hosts for this loop 34886 1727204487.69805: getting the next task for host managed-node3 34886 1727204487.69815: done getting next task for host managed-node3 34886 1727204487.69821: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 34886 1727204487.69824: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204487.69828: getting variables 34886 1727204487.69829: in VariableManager get_vars() 34886 1727204487.69884: Calling all_inventory to load vars for managed-node3 34886 1727204487.69888: Calling groups_inventory to load vars for managed-node3 34886 1727204487.69892: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204487.69903: Calling all_plugins_play to load vars for managed-node3 34886 1727204487.69908: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204487.69912: Calling groups_plugins_play to load vars for managed-node3 34886 1727204487.70148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204487.70400: done with get_vars() 34886 1727204487.70409: done getting variables 34886 1727204487.70474: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.025) 0:00:05.872 ***** 34886 1727204487.70499: entering _queue_task() for managed-node3/fail 34886 1727204487.70830: worker is 1 (out of 1 available) 34886 1727204487.70845: exiting _queue_task() for managed-node3/fail 34886 1727204487.70859: done queuing things up, now waiting for results queue to drain 34886 1727204487.70861: waiting for pending results... 34886 1727204487.71173: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 34886 1727204487.71178: in run() - task 12b410aa-8751-04b9-2e74-000000000157 34886 1727204487.71181: variable 'ansible_search_path' from source: unknown 34886 1727204487.71184: variable 'ansible_search_path' from source: unknown 34886 1727204487.71208: calling self._execute() 34886 1727204487.71292: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.71299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.71309: variable 'omit' from source: magic vars 34886 1727204487.71660: variable 'ansible_distribution_major_version' from source: facts 34886 1727204487.71671: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204487.71810: variable 'type' from source: play vars 34886 1727204487.71823: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 34886 1727204487.71828: when evaluation is False, skipping this task 34886 1727204487.71831: _execute() done 34886 1727204487.71834: dumping result to json 34886 1727204487.71837: done dumping result, returning 34886 1727204487.71840: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [12b410aa-8751-04b9-2e74-000000000157] 34886 1727204487.71897: sending task result for task 12b410aa-8751-04b9-2e74-000000000157 34886 1727204487.71970: done sending task result for task 12b410aa-8751-04b9-2e74-000000000157 34886 1727204487.71973: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 34886 1727204487.72015: no more pending 
results, returning what we have 34886 1727204487.72018: results queue empty 34886 1727204487.72022: checking for any_errors_fatal 34886 1727204487.72029: done checking for any_errors_fatal 34886 1727204487.72030: checking for max_fail_percentage 34886 1727204487.72031: done checking for max_fail_percentage 34886 1727204487.72032: checking to see if all hosts have failed and the running result is not ok 34886 1727204487.72033: done checking to see if all hosts have failed 34886 1727204487.72034: getting the remaining hosts for this loop 34886 1727204487.72035: done getting the remaining hosts for this loop 34886 1727204487.72039: getting the next task for host managed-node3 34886 1727204487.72044: done getting next task for host managed-node3 34886 1727204487.72047: ^ task is: TASK: Include the task 'show_interfaces.yml' 34886 1727204487.72050: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204487.72053: getting variables 34886 1727204487.72054: in VariableManager get_vars() 34886 1727204487.72081: Calling all_inventory to load vars for managed-node3 34886 1727204487.72083: Calling groups_inventory to load vars for managed-node3 34886 1727204487.72085: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204487.72094: Calling all_plugins_play to load vars for managed-node3 34886 1727204487.72096: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204487.72099: Calling groups_plugins_play to load vars for managed-node3 34886 1727204487.72251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204487.72450: done with get_vars() 34886 1727204487.72458: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.020) 0:00:05.893 ***** 34886 1727204487.72534: entering _queue_task() for managed-node3/include_tasks 34886 1727204487.72724: worker is 1 (out of 1 available) 34886 1727204487.72737: exiting _queue_task() for managed-node3/include_tasks 34886 1727204487.72752: done queuing things up, now waiting for results queue to drain 34886 1727204487.72754: waiting for pending results... 
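The two skipped tasks above (manage_test_interface.yml:3 and :8) are input guards implemented with the fail action; the skip output records their conditions verbatim (state not in ["present", "absent"] and type not in ["dummy", "tap", "veth"]). A sketch of what such guards typically look like; only the when expressions are taken from the log, the fail messages are assumptions.

# Sketch of the guard tasks at tasks/manage_test_interface.yml:3 and :8 (assumed form)
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Unsupported state: {{ state }}"    # message text is an assumption
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Unsupported type: {{ type }}"      # message text is an assumption
  when: type not in ["dummy", "tap", "veth"]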
34886 1727204487.72900: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 34886 1727204487.72969: in run() - task 12b410aa-8751-04b9-2e74-000000000158 34886 1727204487.72984: variable 'ansible_search_path' from source: unknown 34886 1727204487.72990: variable 'ansible_search_path' from source: unknown 34886 1727204487.73023: calling self._execute() 34886 1727204487.73153: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.73158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.73161: variable 'omit' from source: magic vars 34886 1727204487.73578: variable 'ansible_distribution_major_version' from source: facts 34886 1727204487.73583: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204487.73586: _execute() done 34886 1727204487.73591: dumping result to json 34886 1727204487.73594: done dumping result, returning 34886 1727204487.73600: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-04b9-2e74-000000000158] 34886 1727204487.73607: sending task result for task 12b410aa-8751-04b9-2e74-000000000158 34886 1727204487.73701: done sending task result for task 12b410aa-8751-04b9-2e74-000000000158 34886 1727204487.73704: WORKER PROCESS EXITING 34886 1727204487.73753: no more pending results, returning what we have 34886 1727204487.73757: in VariableManager get_vars() 34886 1727204487.73810: Calling all_inventory to load vars for managed-node3 34886 1727204487.73812: Calling groups_inventory to load vars for managed-node3 34886 1727204487.73814: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204487.73827: Calling all_plugins_play to load vars for managed-node3 34886 1727204487.73829: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204487.73832: Calling groups_plugins_play to load vars for managed-node3 34886 1727204487.74063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204487.74244: done with get_vars() 34886 1727204487.74251: variable 'ansible_search_path' from source: unknown 34886 1727204487.74252: variable 'ansible_search_path' from source: unknown 34886 1727204487.74279: we have included files to process 34886 1727204487.74280: generating all_blocks data 34886 1727204487.74281: done generating all_blocks data 34886 1727204487.74285: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34886 1727204487.74286: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34886 1727204487.74287: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34886 1727204487.74371: in VariableManager get_vars() 34886 1727204487.74391: done with get_vars() 34886 1727204487.74493: done processing included file 34886 1727204487.74495: iterating over new_blocks loaded from include file 34886 1727204487.74496: in VariableManager get_vars() 34886 1727204487.74511: done with get_vars() 34886 1727204487.74512: filtering new block on tags 34886 1727204487.74527: done filtering new block on tags 34886 1727204487.74529: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 34886 1727204487.74532: extending task lists for all hosts with included blocks 34886 1727204487.74848: done extending task lists 34886 1727204487.74849: done processing included files 34886 1727204487.74850: results queue empty 34886 1727204487.74851: checking for any_errors_fatal 34886 1727204487.74853: done checking for any_errors_fatal 34886 1727204487.74853: checking for max_fail_percentage 34886 1727204487.74854: done checking for max_fail_percentage 34886 1727204487.74855: checking to see if all hosts have failed and the running result is not ok 34886 1727204487.74855: done checking to see if all hosts have failed 34886 1727204487.74856: getting the remaining hosts for this loop 34886 1727204487.74857: done getting the remaining hosts for this loop 34886 1727204487.74858: getting the next task for host managed-node3 34886 1727204487.74861: done getting next task for host managed-node3 34886 1727204487.74863: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 34886 1727204487.74865: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204487.74867: getting variables 34886 1727204487.74867: in VariableManager get_vars() 34886 1727204487.74877: Calling all_inventory to load vars for managed-node3 34886 1727204487.74879: Calling groups_inventory to load vars for managed-node3 34886 1727204487.74881: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204487.74886: Calling all_plugins_play to load vars for managed-node3 34886 1727204487.74888: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204487.74893: Calling groups_plugins_play to load vars for managed-node3 34886 1727204487.75053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204487.75305: done with get_vars() 34886 1727204487.75317: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.028) 0:00:05.921 ***** 34886 1727204487.75386: entering _queue_task() for managed-node3/include_tasks 34886 1727204487.75636: worker is 1 (out of 1 available) 34886 1727204487.75651: exiting _queue_task() for managed-node3/include_tasks 34886 1727204487.75664: done queuing things up, now waiting for results queue to drain 34886 1727204487.75666: waiting for pending results... 
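The task paths logged for this block show that show_interfaces.yml first includes get_current_interfaces.yml (line 3) and then prints the result (line 5, the debug task sketched earlier). An outline of that file, inferred only from the task paths and names in this log:

# Inferred outline of tasks/show_interfaces.yml
- name: Include the task 'get_current_interfaces.yml'   # show_interfaces.yml:3
  include_tasks: get_current_interfaces.yml
# show_interfaces.yml:5 is the 'Show current_interfaces' debug task sketched above.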
34886 1727204487.75845: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 34886 1727204487.75957: in run() - task 12b410aa-8751-04b9-2e74-00000000017f 34886 1727204487.75962: variable 'ansible_search_path' from source: unknown 34886 1727204487.75965: variable 'ansible_search_path' from source: unknown 34886 1727204487.75987: calling self._execute() 34886 1727204487.76061: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.76068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.76077: variable 'omit' from source: magic vars 34886 1727204487.76448: variable 'ansible_distribution_major_version' from source: facts 34886 1727204487.76461: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204487.76464: _execute() done 34886 1727204487.76471: dumping result to json 34886 1727204487.76474: done dumping result, returning 34886 1727204487.76482: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-04b9-2e74-00000000017f] 34886 1727204487.76487: sending task result for task 12b410aa-8751-04b9-2e74-00000000017f 34886 1727204487.76643: done sending task result for task 12b410aa-8751-04b9-2e74-00000000017f 34886 1727204487.76646: WORKER PROCESS EXITING 34886 1727204487.76676: no more pending results, returning what we have 34886 1727204487.76680: in VariableManager get_vars() 34886 1727204487.76730: Calling all_inventory to load vars for managed-node3 34886 1727204487.76734: Calling groups_inventory to load vars for managed-node3 34886 1727204487.76737: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204487.76747: Calling all_plugins_play to load vars for managed-node3 34886 1727204487.76749: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204487.76753: Calling groups_plugins_play to load vars for managed-node3 34886 1727204487.76965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204487.77230: done with get_vars() 34886 1727204487.77236: variable 'ansible_search_path' from source: unknown 34886 1727204487.77238: variable 'ansible_search_path' from source: unknown 34886 1727204487.77303: we have included files to process 34886 1727204487.77304: generating all_blocks data 34886 1727204487.77305: done generating all_blocks data 34886 1727204487.77306: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34886 1727204487.77307: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34886 1727204487.77308: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34886 1727204487.77610: done processing included file 34886 1727204487.77613: iterating over new_blocks loaded from include file 34886 1727204487.77614: in VariableManager get_vars() 34886 1727204487.77632: done with get_vars() 34886 1727204487.77633: filtering new block on tags 34886 1727204487.77647: done filtering new block on tags 34886 1727204487.77649: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node3 34886 1727204487.77652: extending task lists for all hosts with included blocks 34886 1727204487.77869: done extending task lists 34886 1727204487.77875: done processing included files 34886 1727204487.77877: results queue empty 34886 1727204487.77877: checking for any_errors_fatal 34886 1727204487.77884: done checking for any_errors_fatal 34886 1727204487.77885: checking for max_fail_percentage 34886 1727204487.77887: done checking for max_fail_percentage 34886 1727204487.77887: checking to see if all hosts have failed and the running result is not ok 34886 1727204487.77888: done checking to see if all hosts have failed 34886 1727204487.77891: getting the remaining hosts for this loop 34886 1727204487.77895: done getting the remaining hosts for this loop 34886 1727204487.77899: getting the next task for host managed-node3 34886 1727204487.77905: done getting next task for host managed-node3 34886 1727204487.77907: ^ task is: TASK: Gather current interface info 34886 1727204487.77917: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204487.77924: getting variables 34886 1727204487.77925: in VariableManager get_vars() 34886 1727204487.77940: Calling all_inventory to load vars for managed-node3 34886 1727204487.77943: Calling groups_inventory to load vars for managed-node3 34886 1727204487.77946: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204487.77953: Calling all_plugins_play to load vars for managed-node3 34886 1727204487.77955: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204487.77957: Calling groups_plugins_play to load vars for managed-node3 34886 1727204487.78164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204487.78430: done with get_vars() 34886 1727204487.78438: done getting variables 34886 1727204487.78479: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.031) 0:00:05.952 ***** 34886 1727204487.78511: entering _queue_task() for managed-node3/command 34886 1727204487.78714: worker is 1 (out of 1 available) 34886 1727204487.78730: exiting _queue_task() for managed-node3/command 34886 1727204487.78744: done queuing things up, now waiting for results queue to drain 34886 1727204487.78746: waiting for pending results... 
34886 1727204487.78918: running TaskExecutor() for managed-node3/TASK: Gather current interface info 34886 1727204487.79009: in run() - task 12b410aa-8751-04b9-2e74-0000000001b6 34886 1727204487.79025: variable 'ansible_search_path' from source: unknown 34886 1727204487.79030: variable 'ansible_search_path' from source: unknown 34886 1727204487.79057: calling self._execute() 34886 1727204487.79128: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.79135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.79145: variable 'omit' from source: magic vars 34886 1727204487.79530: variable 'ansible_distribution_major_version' from source: facts 34886 1727204487.79542: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204487.79549: variable 'omit' from source: magic vars 34886 1727204487.79599: variable 'omit' from source: magic vars 34886 1727204487.79629: variable 'omit' from source: magic vars 34886 1727204487.79666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204487.79895: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204487.79900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204487.79903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204487.79908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204487.79911: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204487.79913: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.79917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.80008: Set connection var ansible_timeout to 10 34886 1727204487.80026: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204487.80037: Set connection var ansible_connection to ssh 34886 1727204487.80054: Set connection var ansible_shell_executable to /bin/sh 34886 1727204487.80075: Set connection var ansible_pipelining to False 34886 1727204487.80087: Set connection var ansible_shell_type to sh 34886 1727204487.80127: variable 'ansible_shell_executable' from source: unknown 34886 1727204487.80138: variable 'ansible_connection' from source: unknown 34886 1727204487.80153: variable 'ansible_module_compression' from source: unknown 34886 1727204487.80163: variable 'ansible_shell_type' from source: unknown 34886 1727204487.80172: variable 'ansible_shell_executable' from source: unknown 34886 1727204487.80181: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204487.80193: variable 'ansible_pipelining' from source: unknown 34886 1727204487.80196: variable 'ansible_timeout' from source: unknown 34886 1727204487.80201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204487.80341: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204487.80355: variable 'omit' from source: magic vars 34886 
1727204487.80358: starting attempt loop 34886 1727204487.80361: running the handler 34886 1727204487.80364: _low_level_execute_command(): starting 34886 1727204487.80428: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204487.81025: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204487.81029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204487.81034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204487.81037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204487.81086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204487.81100: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204487.81106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204487.81178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204487.83539: stdout chunk (state=3): >>>/root <<< 34886 1727204487.83721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204487.83767: stderr chunk (state=3): >>><<< 34886 1727204487.83770: stdout chunk (state=3): >>><<< 34886 1727204487.83795: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204487.83809: _low_level_execute_command(): starting 34886 1727204487.83816: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713 `" && echo 
ansible-tmp-1727204487.8379498-35409-256840517386713="` echo /root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713 `" ) && sleep 0' 34886 1727204487.84426: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204487.84450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204487.87347: stdout chunk (state=3): >>>ansible-tmp-1727204487.8379498-35409-256840517386713=/root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713 <<< 34886 1727204487.87614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204487.87617: stdout chunk (state=3): >>><<< 34886 1727204487.87623: stderr chunk (state=3): >>><<< 34886 1727204487.87901: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204487.8379498-35409-256840517386713=/root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204487.87904: variable 'ansible_module_compression' from source: unknown 34886 1727204487.87907: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204487.87909: variable 'ansible_facts' from source: unknown 34886 1727204487.88020: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713/AnsiballZ_command.py 34886 
1727204487.88220: Sending initial data 34886 1727204487.88224: Sent initial data (156 bytes) 34886 1727204487.88775: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204487.88894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204487.88911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204487.88984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204487.91454: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204487.91504: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204487.91549: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpob6lc8n3 /root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713/AnsiballZ_command.py <<< 34886 1727204487.91565: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713/AnsiballZ_command.py" <<< 34886 1727204487.91600: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpob6lc8n3" to remote "/root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713/AnsiballZ_command.py" <<< 34886 1727204487.92794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204487.92857: stderr chunk (state=3): >>><<< 34886 1727204487.92871: stdout chunk (state=3): >>><<< 34886 1727204487.92918: done transferring module to remote 34886 1727204487.92936: _low_level_execute_command(): starting 34886 1727204487.92947: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713/ /root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713/AnsiballZ_command.py && sleep 0' 34886 1727204487.93693: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204487.93710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204487.93798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204487.93851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204487.93907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204487.96677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204487.96680: stdout chunk (state=3): >>><<< 34886 1727204487.96683: stderr chunk (state=3): >>><<< 34886 1727204487.96686: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204487.96695: _low_level_execute_command(): starting 34886 1727204487.96697: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713/AnsiballZ_command.py && sleep 0' 34886 1727204487.97242: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204487.97268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204487.97285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204487.97318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204487.97408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204488.25107: stdout chunk (state=3): >>> <<< 34886 1727204488.25263: stdout chunk (state=3): >>>{"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:01:28.244352", "end": "2024-09-24 15:01:28.249734", "delta": "0:00:00.005382", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204488.27530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204488.27587: stderr chunk (state=3): >>><<< 34886 1727204488.27593: stdout chunk (state=3): >>><<< 34886 1727204488.27609: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:01:28.244352", "end": "2024-09-24 15:01:28.249734", "delta": "0:00:00.005382", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
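The module invocation echoed back above pins down what 'Gather current interface info' does: it runs the command module with _raw_params "ls -1" and chdir /sys/class/net, listing the interface names under /sys/class/net. A sketch of the task at get_current_interfaces.yml:3; the register name is hypothetical, since the log only shows the module arguments and the later set_fact result.

# Sketch of the task at tasks/get_current_interfaces.yml:3 (register name assumed)
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces

On this host the command returns bonding_masters, eth0 and lo, matching the STDOUT shown in the formatted task result below.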
34886 1727204488.27650: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204488.27659: _low_level_execute_command(): starting 34886 1727204488.27665: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204487.8379498-35409-256840517386713/ > /dev/null 2>&1 && sleep 0' 34886 1727204488.28125: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204488.28129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204488.28131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204488.28134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204488.28192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204488.28200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204488.28241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204488.30890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204488.30935: stderr chunk (state=3): >>><<< 34886 1727204488.30939: stdout chunk (state=3): >>><<< 34886 1727204488.30953: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204488.30962: handler run complete 34886 1727204488.30987: Evaluated conditional (False): False 34886 1727204488.31001: attempt loop complete, returning result 34886 1727204488.31004: _execute() done 34886 1727204488.31007: dumping result to json 34886 1727204488.31014: done dumping result, returning 34886 1727204488.31027: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [12b410aa-8751-04b9-2e74-0000000001b6] 34886 1727204488.31038: sending task result for task 12b410aa-8751-04b9-2e74-0000000001b6 34886 1727204488.31150: done sending task result for task 12b410aa-8751-04b9-2e74-0000000001b6 34886 1727204488.31153: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.005382", "end": "2024-09-24 15:01:28.249734", "rc": 0, "start": "2024-09-24 15:01:28.244352" } STDOUT: bonding_masters eth0 lo 34886 1727204488.31251: no more pending results, returning what we have 34886 1727204488.31255: results queue empty 34886 1727204488.31257: checking for any_errors_fatal 34886 1727204488.31258: done checking for any_errors_fatal 34886 1727204488.31259: checking for max_fail_percentage 34886 1727204488.31261: done checking for max_fail_percentage 34886 1727204488.31266: checking to see if all hosts have failed and the running result is not ok 34886 1727204488.31268: done checking to see if all hosts have failed 34886 1727204488.31268: getting the remaining hosts for this loop 34886 1727204488.31270: done getting the remaining hosts for this loop 34886 1727204488.31275: getting the next task for host managed-node3 34886 1727204488.31283: done getting next task for host managed-node3 34886 1727204488.31285: ^ task is: TASK: Set current_interfaces 34886 1727204488.31293: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204488.31297: getting variables 34886 1727204488.31298: in VariableManager get_vars() 34886 1727204488.31340: Calling all_inventory to load vars for managed-node3 34886 1727204488.31344: Calling groups_inventory to load vars for managed-node3 34886 1727204488.31346: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204488.31357: Calling all_plugins_play to load vars for managed-node3 34886 1727204488.31360: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204488.31364: Calling groups_plugins_play to load vars for managed-node3 34886 1727204488.31564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204488.31755: done with get_vars() 34886 1727204488.31764: done getting variables 34886 1727204488.31816: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:01:28 -0400 (0:00:00.533) 0:00:06.486 ***** 34886 1727204488.31844: entering _queue_task() for managed-node3/set_fact 34886 1727204488.32061: worker is 1 (out of 1 available) 34886 1727204488.32075: exiting _queue_task() for managed-node3/set_fact 34886 1727204488.32092: done queuing things up, now waiting for results queue to drain 34886 1727204488.32094: waiting for pending results... 
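The "Set current_interfaces" task being queued here is a set_fact step; its result further down publishes current_interfaces as ['bonding_masters', 'eth0', 'lo'], i.e. the lines returned by the previous command. A minimal sketch consistent with that, assuming the fact is derived from the stdout lines of the registered command result (the exact Jinja expression is not visible in the log):

- name: Set current_interfaces
  set_fact:
    # assumption: built from the stdout lines of the registered 'ls -1' result
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"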
34886 1727204488.32253: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 34886 1727204488.32343: in run() - task 12b410aa-8751-04b9-2e74-0000000001b7 34886 1727204488.32357: variable 'ansible_search_path' from source: unknown 34886 1727204488.32361: variable 'ansible_search_path' from source: unknown 34886 1727204488.32394: calling self._execute() 34886 1727204488.32462: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204488.32469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204488.32479: variable 'omit' from source: magic vars 34886 1727204488.32794: variable 'ansible_distribution_major_version' from source: facts 34886 1727204488.32806: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204488.32812: variable 'omit' from source: magic vars 34886 1727204488.32857: variable 'omit' from source: magic vars 34886 1727204488.32945: variable '_current_interfaces' from source: set_fact 34886 1727204488.33002: variable 'omit' from source: magic vars 34886 1727204488.33036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204488.33065: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204488.33086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204488.33103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204488.33118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204488.33148: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204488.33152: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204488.33157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204488.33256: Set connection var ansible_timeout to 10 34886 1727204488.33262: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204488.33265: Set connection var ansible_connection to ssh 34886 1727204488.33273: Set connection var ansible_shell_executable to /bin/sh 34886 1727204488.33281: Set connection var ansible_pipelining to False 34886 1727204488.33284: Set connection var ansible_shell_type to sh 34886 1727204488.33309: variable 'ansible_shell_executable' from source: unknown 34886 1727204488.33313: variable 'ansible_connection' from source: unknown 34886 1727204488.33317: variable 'ansible_module_compression' from source: unknown 34886 1727204488.33320: variable 'ansible_shell_type' from source: unknown 34886 1727204488.33327: variable 'ansible_shell_executable' from source: unknown 34886 1727204488.33330: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204488.33335: variable 'ansible_pipelining' from source: unknown 34886 1727204488.33338: variable 'ansible_timeout' from source: unknown 34886 1727204488.33344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204488.33465: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 34886 1727204488.33476: variable 'omit' from source: magic vars 34886 1727204488.33482: starting attempt loop 34886 1727204488.33485: running the handler 34886 1727204488.33499: handler run complete 34886 1727204488.33510: attempt loop complete, returning result 34886 1727204488.33513: _execute() done 34886 1727204488.33516: dumping result to json 34886 1727204488.33518: done dumping result, returning 34886 1727204488.33533: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [12b410aa-8751-04b9-2e74-0000000001b7] 34886 1727204488.33539: sending task result for task 12b410aa-8751-04b9-2e74-0000000001b7 34886 1727204488.33625: done sending task result for task 12b410aa-8751-04b9-2e74-0000000001b7 34886 1727204488.33636: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 34886 1727204488.33700: no more pending results, returning what we have 34886 1727204488.33703: results queue empty 34886 1727204488.33704: checking for any_errors_fatal 34886 1727204488.33711: done checking for any_errors_fatal 34886 1727204488.33712: checking for max_fail_percentage 34886 1727204488.33713: done checking for max_fail_percentage 34886 1727204488.33714: checking to see if all hosts have failed and the running result is not ok 34886 1727204488.33715: done checking to see if all hosts have failed 34886 1727204488.33716: getting the remaining hosts for this loop 34886 1727204488.33717: done getting the remaining hosts for this loop 34886 1727204488.33721: getting the next task for host managed-node3 34886 1727204488.33729: done getting next task for host managed-node3 34886 1727204488.33732: ^ task is: TASK: Show current_interfaces 34886 1727204488.33736: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204488.33741: getting variables 34886 1727204488.33743: in VariableManager get_vars() 34886 1727204488.33778: Calling all_inventory to load vars for managed-node3 34886 1727204488.33781: Calling groups_inventory to load vars for managed-node3 34886 1727204488.33784: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204488.33793: Calling all_plugins_play to load vars for managed-node3 34886 1727204488.33796: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204488.33798: Calling groups_plugins_play to load vars for managed-node3 34886 1727204488.33948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204488.34132: done with get_vars() 34886 1727204488.34140: done getting variables 34886 1727204488.34182: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:01:28 -0400 (0:00:00.023) 0:00:06.509 ***** 34886 1727204488.34209: entering _queue_task() for managed-node3/debug 34886 1727204488.34403: worker is 1 (out of 1 available) 34886 1727204488.34419: exiting _queue_task() for managed-node3/debug 34886 1727204488.34432: done queuing things up, now waiting for results queue to drain 34886 1727204488.34435: waiting for pending results... 34886 1727204488.34583: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 34886 1727204488.34659: in run() - task 12b410aa-8751-04b9-2e74-000000000180 34886 1727204488.34676: variable 'ansible_search_path' from source: unknown 34886 1727204488.34679: variable 'ansible_search_path' from source: unknown 34886 1727204488.34709: calling self._execute() 34886 1727204488.34773: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204488.34778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204488.34795: variable 'omit' from source: magic vars 34886 1727204488.35087: variable 'ansible_distribution_major_version' from source: facts 34886 1727204488.35099: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204488.35106: variable 'omit' from source: magic vars 34886 1727204488.35149: variable 'omit' from source: magic vars 34886 1727204488.35232: variable 'current_interfaces' from source: set_fact 34886 1727204488.35253: variable 'omit' from source: magic vars 34886 1727204488.35284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204488.35314: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204488.35339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204488.35354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204488.35364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204488.35391: 
variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204488.35395: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204488.35400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204488.35485: Set connection var ansible_timeout to 10 34886 1727204488.35492: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204488.35495: Set connection var ansible_connection to ssh 34886 1727204488.35503: Set connection var ansible_shell_executable to /bin/sh 34886 1727204488.35511: Set connection var ansible_pipelining to False 34886 1727204488.35514: Set connection var ansible_shell_type to sh 34886 1727204488.35537: variable 'ansible_shell_executable' from source: unknown 34886 1727204488.35541: variable 'ansible_connection' from source: unknown 34886 1727204488.35543: variable 'ansible_module_compression' from source: unknown 34886 1727204488.35548: variable 'ansible_shell_type' from source: unknown 34886 1727204488.35551: variable 'ansible_shell_executable' from source: unknown 34886 1727204488.35554: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204488.35568: variable 'ansible_pipelining' from source: unknown 34886 1727204488.35570: variable 'ansible_timeout' from source: unknown 34886 1727204488.35573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204488.35684: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204488.35693: variable 'omit' from source: magic vars 34886 1727204488.35699: starting attempt loop 34886 1727204488.35702: running the handler 34886 1727204488.35743: handler run complete 34886 1727204488.35756: attempt loop complete, returning result 34886 1727204488.35759: _execute() done 34886 1727204488.35762: dumping result to json 34886 1727204488.35767: done dumping result, returning 34886 1727204488.35780: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [12b410aa-8751-04b9-2e74-000000000180] 34886 1727204488.35785: sending task result for task 12b410aa-8751-04b9-2e74-000000000180 34886 1727204488.35865: done sending task result for task 12b410aa-8751-04b9-2e74-000000000180 34886 1727204488.35868: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 34886 1727204488.35929: no more pending results, returning what we have 34886 1727204488.35932: results queue empty 34886 1727204488.35933: checking for any_errors_fatal 34886 1727204488.35937: done checking for any_errors_fatal 34886 1727204488.35938: checking for max_fail_percentage 34886 1727204488.35940: done checking for max_fail_percentage 34886 1727204488.35941: checking to see if all hosts have failed and the running result is not ok 34886 1727204488.35942: done checking to see if all hosts have failed 34886 1727204488.35943: getting the remaining hosts for this loop 34886 1727204488.35944: done getting the remaining hosts for this loop 34886 1727204488.35947: getting the next task for host managed-node3 34886 1727204488.35955: done getting next task for host managed-node3 34886 1727204488.35958: ^ task is: TASK: Install iproute 34886 1727204488.35961: ^ state is: HOST STATE: block=2, task=4, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204488.35965: getting variables 34886 1727204488.35966: in VariableManager get_vars() 34886 1727204488.36003: Calling all_inventory to load vars for managed-node3 34886 1727204488.36006: Calling groups_inventory to load vars for managed-node3 34886 1727204488.36007: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204488.36015: Calling all_plugins_play to load vars for managed-node3 34886 1727204488.36017: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204488.36020: Calling groups_plugins_play to load vars for managed-node3 34886 1727204488.36194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204488.36376: done with get_vars() 34886 1727204488.36383: done getting variables 34886 1727204488.36429: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:01:28 -0400 (0:00:00.022) 0:00:06.532 ***** 34886 1727204488.36453: entering _queue_task() for managed-node3/package 34886 1727204488.36634: worker is 1 (out of 1 available) 34886 1727204488.36649: exiting _queue_task() for managed-node3/package 34886 1727204488.36661: done queuing things up, now waiting for results queue to drain 34886 1727204488.36663: waiting for pending results... 
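The "Install iproute" task queued here resolves, per the module invocation later in the trace, to ansible.legacy.dnf with name ['iproute'] and state 'present' (all other dnf options at their defaults), and the run ends with "Nothing to do". Because the trace also evaluates "__install_status is success" and the result reports attempts: 1, the task is presumably registered and retried until success; the following is a sketch under those assumptions (retry and delay values are not visible in the log):

- name: Install iproute
  package:
    name: iproute
    state: present
  register: __install_status          # the trace later evaluates '__install_status is success'
  until: __install_status is success  # assumption: 'attempts: 1' in the result suggests an until/retries loop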
34886 1727204488.36814: running TaskExecutor() for managed-node3/TASK: Install iproute 34886 1727204488.36879: in run() - task 12b410aa-8751-04b9-2e74-000000000159 34886 1727204488.36899: variable 'ansible_search_path' from source: unknown 34886 1727204488.36904: variable 'ansible_search_path' from source: unknown 34886 1727204488.36934: calling self._execute() 34886 1727204488.36999: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204488.37003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204488.37012: variable 'omit' from source: magic vars 34886 1727204488.37299: variable 'ansible_distribution_major_version' from source: facts 34886 1727204488.37310: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204488.37316: variable 'omit' from source: magic vars 34886 1727204488.37352: variable 'omit' from source: magic vars 34886 1727204488.37506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204488.39130: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204488.39191: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204488.39227: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204488.39256: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204488.39279: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204488.39360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204488.39383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204488.39409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204488.39447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204488.39460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204488.39547: variable '__network_is_ostree' from source: set_fact 34886 1727204488.39552: variable 'omit' from source: magic vars 34886 1727204488.39574: variable 'omit' from source: magic vars 34886 1727204488.39598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204488.39624: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204488.39643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204488.39658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 34886 1727204488.39669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204488.39696: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204488.39699: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204488.39704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204488.39788: Set connection var ansible_timeout to 10 34886 1727204488.39796: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204488.39799: Set connection var ansible_connection to ssh 34886 1727204488.39806: Set connection var ansible_shell_executable to /bin/sh 34886 1727204488.39814: Set connection var ansible_pipelining to False 34886 1727204488.39817: Set connection var ansible_shell_type to sh 34886 1727204488.39844: variable 'ansible_shell_executable' from source: unknown 34886 1727204488.39847: variable 'ansible_connection' from source: unknown 34886 1727204488.39850: variable 'ansible_module_compression' from source: unknown 34886 1727204488.39852: variable 'ansible_shell_type' from source: unknown 34886 1727204488.39855: variable 'ansible_shell_executable' from source: unknown 34886 1727204488.39865: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204488.39868: variable 'ansible_pipelining' from source: unknown 34886 1727204488.39870: variable 'ansible_timeout' from source: unknown 34886 1727204488.39873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204488.39958: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204488.39974: variable 'omit' from source: magic vars 34886 1727204488.39978: starting attempt loop 34886 1727204488.39980: running the handler 34886 1727204488.39986: variable 'ansible_facts' from source: unknown 34886 1727204488.39992: variable 'ansible_facts' from source: unknown 34886 1727204488.40026: _low_level_execute_command(): starting 34886 1727204488.40033: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204488.40563: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204488.40567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204488.40570: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204488.40573: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204488.40626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204488.40630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204488.40682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204488.43050: stdout chunk (state=3): >>>/root <<< 34886 1727204488.43312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204488.43316: stdout chunk (state=3): >>><<< 34886 1727204488.43318: stderr chunk (state=3): >>><<< 34886 1727204488.43450: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204488.43460: _low_level_execute_command(): starting 34886 1727204488.43463: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644 `" && echo ansible-tmp-1727204488.433488-35437-210581798772644="` echo /root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644 `" ) && sleep 0' 34886 1727204488.44011: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204488.44108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204488.44141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204488.44162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204488.44185: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204488.44270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204488.47063: stdout chunk (state=3): >>>ansible-tmp-1727204488.433488-35437-210581798772644=/root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644 <<< 34886 1727204488.47248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204488.47296: stderr chunk (state=3): >>><<< 34886 1727204488.47300: stdout chunk (state=3): >>><<< 34886 1727204488.47317: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204488.433488-35437-210581798772644=/root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204488.47346: variable 'ansible_module_compression' from source: unknown 34886 1727204488.47409: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 34886 1727204488.47413: ANSIBALLZ: Acquiring lock 34886 1727204488.47416: ANSIBALLZ: Lock acquired: 139734986903328 34886 1727204488.47421: ANSIBALLZ: Creating module 34886 1727204488.60906: ANSIBALLZ: Writing module into payload 34886 1727204488.61098: ANSIBALLZ: Writing module 34886 1727204488.61132: ANSIBALLZ: Renaming module 34886 1727204488.61135: ANSIBALLZ: Done creating module 34886 1727204488.61143: variable 'ansible_facts' from source: unknown 34886 1727204488.61205: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644/AnsiballZ_dnf.py 34886 1727204488.61321: Sending initial data 34886 1727204488.61325: Sent initial data (151 bytes) 34886 1727204488.61824: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204488.61828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204488.61830: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 34886 1727204488.61833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204488.61885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204488.61889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204488.61942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204488.64302: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204488.64306: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204488.64338: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204488.64383: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpqvpgrdkk /root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644/AnsiballZ_dnf.py <<< 34886 1727204488.64392: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644/AnsiballZ_dnf.py" <<< 34886 1727204488.64413: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpqvpgrdkk" to remote "/root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644/AnsiballZ_dnf.py" <<< 34886 1727204488.65464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204488.65535: stderr chunk (state=3): >>><<< 34886 1727204488.65539: stdout chunk (state=3): >>><<< 34886 1727204488.65559: done transferring module to remote 34886 1727204488.65570: _low_level_execute_command(): starting 34886 1727204488.65579: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644/ /root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644/AnsiballZ_dnf.py && sleep 0' 34886 1727204488.66045: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204488.66048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204488.66051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204488.66109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204488.66113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204488.66157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204488.68885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204488.68935: stderr chunk (state=3): >>><<< 34886 1727204488.68940: stdout chunk (state=3): >>><<< 34886 1727204488.68957: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34886 1727204488.68966: _low_level_execute_command(): starting 34886 1727204488.68969: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644/AnsiballZ_dnf.py && sleep 0' 34886 1727204488.69433: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204488.69436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204488.69439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204488.69441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204488.69443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204488.69500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204488.69504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204488.69549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34886 1727204490.39787: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 34886 1727204490.44695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204490.44765: stderr chunk (state=3): >>><<< 34886 1727204490.44769: stdout chunk (state=3): >>><<< 34886 1727204490.44786: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 34886 1727204490.44834: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204490.44845: _low_level_execute_command(): starting 34886 1727204490.44848: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204488.433488-35437-210581798772644/ > /dev/null 2>&1 && sleep 0' 34886 1727204490.45337: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204490.45340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204490.45343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204490.45346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204490.45395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204490.45399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204490.45407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204490.45441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204490.47404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204490.47460: stderr chunk (state=3): >>><<< 34886 1727204490.47463: stdout chunk (state=3): >>><<< 34886 1727204490.47480: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204490.47490: handler run complete 34886 1727204490.47635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204490.47785: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204490.47825: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204490.47851: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204490.47876: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204490.47941: variable '__install_status' from source: unknown 34886 1727204490.47962: Evaluated conditional (__install_status is success): True 34886 1727204490.47978: attempt loop complete, returning result 34886 1727204490.47981: _execute() done 34886 1727204490.47983: dumping result to json 34886 1727204490.47994: done dumping result, returning 34886 1727204490.48005: done running TaskExecutor() for managed-node3/TASK: Install iproute [12b410aa-8751-04b9-2e74-000000000159] 34886 1727204490.48008: sending task result for task 12b410aa-8751-04b9-2e74-000000000159 34886 1727204490.48120: done sending task result for task 12b410aa-8751-04b9-2e74-000000000159 34886 1727204490.48123: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 34886 1727204490.48263: no more pending results, returning what we have 34886 1727204490.48267: results queue empty 34886 1727204490.48268: checking for any_errors_fatal 34886 1727204490.48273: done checking for any_errors_fatal 34886 1727204490.48274: checking for max_fail_percentage 34886 1727204490.48276: done checking for max_fail_percentage 34886 1727204490.48276: checking to see if all hosts have failed and the running result is not ok 34886 1727204490.48277: done checking to see if all hosts have failed 34886 1727204490.48278: getting the remaining hosts for this loop 34886 1727204490.48279: done getting the remaining hosts for this loop 34886 1727204490.48284: getting the next task for host managed-node3 34886 1727204490.48292: done getting next task for host managed-node3 34886 1727204490.48296: ^ task is: TASK: Create veth interface {{ interface }} 34886 1727204490.48299: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 34886 1727204490.48303: getting variables 34886 1727204490.48304: in VariableManager get_vars() 34886 1727204490.48354: Calling all_inventory to load vars for managed-node3 34886 1727204490.48358: Calling groups_inventory to load vars for managed-node3 34886 1727204490.48360: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204490.48372: Calling all_plugins_play to load vars for managed-node3 34886 1727204490.48376: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204490.48379: Calling groups_plugins_play to load vars for managed-node3 34886 1727204490.48613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204490.48799: done with get_vars() 34886 1727204490.48809: done getting variables 34886 1727204490.48861: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204490.48966: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:01:30 -0400 (0:00:02.125) 0:00:08.657 ***** 34886 1727204490.49005: entering _queue_task() for managed-node3/command 34886 1727204490.49226: worker is 1 (out of 1 available) 34886 1727204490.49242: exiting _queue_task() for managed-node3/command 34886 1727204490.49255: done queuing things up, now waiting for results queue to drain 34886 1727204490.49257: waiting for pending results... 
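The "Create veth interface veth0" task queued here uses the command action together with the items lookup (one command per item) and, as the trace shows next, is gated on the conditional type == 'veth' and state == 'present' and interface not in current_interfaces. The command strings themselves are not visible in this excerpt, so the loop body below is only a placeholder; the structure is a sketch reconstructed from the lookup plugin and conditional seen in the log:

- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  with_items:
    - "..."   # placeholder: the actual ip(8) command strings are not shown in this part of the log
  when: type == 'veth' and state == 'present' and interface not in current_interfaces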
34886 1727204490.49422: running TaskExecutor() for managed-node3/TASK: Create veth interface veth0 34886 1727204490.49486: in run() - task 12b410aa-8751-04b9-2e74-00000000015a 34886 1727204490.49504: variable 'ansible_search_path' from source: unknown 34886 1727204490.49508: variable 'ansible_search_path' from source: unknown 34886 1727204490.49740: variable 'interface' from source: play vars 34886 1727204490.49812: variable 'interface' from source: play vars 34886 1727204490.49877: variable 'interface' from source: play vars 34886 1727204490.50083: Loaded config def from plugin (lookup/items) 34886 1727204490.50092: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 34886 1727204490.50110: variable 'omit' from source: magic vars 34886 1727204490.50194: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204490.50434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204490.50438: variable 'omit' from source: magic vars 34886 1727204490.50559: variable 'ansible_distribution_major_version' from source: facts 34886 1727204490.50574: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204490.50848: variable 'type' from source: play vars 34886 1727204490.50861: variable 'state' from source: include params 34886 1727204490.50873: variable 'interface' from source: play vars 34886 1727204490.50885: variable 'current_interfaces' from source: set_fact 34886 1727204490.50905: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 34886 1727204490.50921: variable 'omit' from source: magic vars 34886 1727204490.50977: variable 'omit' from source: magic vars 34886 1727204490.51050: variable 'item' from source: unknown 34886 1727204490.51150: variable 'item' from source: unknown 34886 1727204490.51178: variable 'omit' from source: magic vars 34886 1727204490.51230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204490.51273: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204490.51305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204490.51342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204490.51363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204490.51407: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204490.51418: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204490.51439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204490.51583: Set connection var ansible_timeout to 10 34886 1727204490.51653: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204490.51656: Set connection var ansible_connection to ssh 34886 1727204490.51659: Set connection var ansible_shell_executable to /bin/sh 34886 1727204490.51662: Set connection var ansible_pipelining to False 34886 1727204490.51664: Set connection var ansible_shell_type to sh 34886 1727204490.51681: variable 'ansible_shell_executable' from source: unknown 34886 1727204490.51694: variable 'ansible_connection' from source: unknown 34886 1727204490.51705: 
variable 'ansible_module_compression' from source: unknown 34886 1727204490.51714: variable 'ansible_shell_type' from source: unknown 34886 1727204490.51727: variable 'ansible_shell_executable' from source: unknown 34886 1727204490.51738: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204490.51748: variable 'ansible_pipelining' from source: unknown 34886 1727204490.51870: variable 'ansible_timeout' from source: unknown 34886 1727204490.51874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204490.51948: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204490.51969: variable 'omit' from source: magic vars 34886 1727204490.51984: starting attempt loop 34886 1727204490.51995: running the handler 34886 1727204490.52021: _low_level_execute_command(): starting 34886 1727204490.52037: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204490.52867: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204490.52933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204490.52972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204490.53007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204490.54884: stdout chunk (state=3): >>>/root <<< 34886 1727204490.54910: stdout chunk (state=3): >>><<< 34886 1727204490.54933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204490.54937: stderr chunk (state=3): >>><<< 34886 1727204490.54986: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204490.54999: _low_level_execute_command(): starting 34886 1727204490.55030: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366 `" && echo ansible-tmp-1727204490.5496192-35658-193608222922366="` echo /root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366 `" ) && sleep 0' 34886 1727204490.55512: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204490.55632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204490.55641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204490.55660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204490.55735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204490.58499: stdout chunk (state=3): >>>ansible-tmp-1727204490.5496192-35658-193608222922366=/root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366 <<< 34886 1727204490.58502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204490.58505: stdout chunk (state=3): >>><<< 34886 1727204490.58507: stderr chunk (state=3): >>><<< 34886 1727204490.58510: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204490.5496192-35658-193608222922366=/root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204490.58512: variable 'ansible_module_compression' from source: unknown 34886 1727204490.58514: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204490.58652: variable 'ansible_facts' from source: unknown 34886 1727204490.59067: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366/AnsiballZ_command.py 34886 1727204490.59225: Sending initial data 34886 1727204490.59235: Sent initial data (156 bytes) 34886 1727204490.59879: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204490.59911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204490.60009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204490.60118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204490.60156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204490.60182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204490.61831: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204490.61861: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204490.61935: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpjkoowl_a /root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366/AnsiballZ_command.py <<< 34886 1727204490.61939: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366/AnsiballZ_command.py" <<< 34886 1727204490.61947: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpjkoowl_a" to remote "/root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366/AnsiballZ_command.py" <<< 34886 1727204490.63335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204490.63338: stdout chunk (state=3): >>><<< 34886 1727204490.63341: stderr chunk (state=3): >>><<< 34886 1727204490.63343: done transferring module to remote 34886 1727204490.63345: _low_level_execute_command(): starting 34886 1727204490.63347: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366/ /root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366/AnsiballZ_command.py && sleep 0' 34886 1727204490.63884: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204490.63907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204490.63922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204490.63945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204490.63965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204490.63978: stderr chunk (state=3): >>>debug2: match not found <<< 34886 1727204490.63998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204490.64023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34886 1727204490.64106: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204490.64131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204490.64155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204490.64214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204490.64228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204490.66183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204490.66197: stdout chunk (state=3): >>><<< 34886 1727204490.66210: stderr chunk (state=3): >>><<< 34886 1727204490.66238: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204490.66248: _low_level_execute_command(): starting 34886 1727204490.66258: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366/AnsiballZ_command.py && sleep 0' 34886 1727204490.66940: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204490.66965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204490.66993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204490.67029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204490.67047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204490.67069: stderr chunk (state=3): >>>debug2: match not found <<< 34886 1727204490.67182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204490.67209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204490.67295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204490.85515: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-24 15:01:30.845793", "end": "2024-09-24 15:01:30.853432", "delta": "0:00:00.007639", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} 
<<< 34886 1727204490.88991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204490.88997: stdout chunk (state=3): >>><<< 34886 1727204490.88999: stderr chunk (state=3): >>><<< 34886 1727204490.89002: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-24 15:01:30.845793", "end": "2024-09-24 15:01:30.853432", "delta": "0:00:00.007639", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
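Condensing the _low_level_execute_command() calls logged above for this first loop item, the remote side of one command-module invocation looks roughly like the sketch below. TMP stands for the per-task temp directory created for this item (the ansible-tmp-1727204490.5496192-35658-193608222922366 path from the log), and the quoting is simplified relative to the exact /bin/sh command lines Ansible issued:

    # Condensed sketch of the remote execution sequence for one loop item.
    TMP=/root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366
    /bin/sh -c 'echo ~ && sleep 0'                                          # probe the remote home directory
    /bin/sh -c "( umask 77 && mkdir -p /root/.ansible/tmp && mkdir $TMP )"  # create the per-task temp dir
    # AnsiballZ_command.py is then uploaded into $TMP over the multiplexed SSH/SFTP connection
    /bin/sh -c "chmod u+x $TMP/ $TMP/AnsiballZ_command.py"                  # make the module runnable
    /bin/sh -c "/usr/bin/python3.12 $TMP/AnsiballZ_command.py"              # runs: ip link add veth0 type veth peer name peerveth0
    /bin/sh -c "rm -f -r $TMP/ > /dev/null 2>&1"                            # remove the temp dir afterwards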
34886 1727204490.89005: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204490.89007: _low_level_execute_command(): starting 34886 1727204490.89010: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204490.5496192-35658-193608222922366/ > /dev/null 2>&1 && sleep 0' 34886 1727204490.89745: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204490.90008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204490.90064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204490.90084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204490.90101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204490.90268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204490.93997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204490.94002: stdout chunk (state=3): >>><<< 34886 1727204490.94005: stderr chunk (state=3): >>><<< 34886 1727204490.94007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204490.94010: handler run complete 34886 1727204490.94012: Evaluated conditional (False): False 34886 1727204490.94014: attempt loop complete, returning result 34886 1727204490.94016: variable 'item' from source: unknown 34886 1727204490.94101: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.007639", "end": "2024-09-24 15:01:30.853432", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-24 15:01:30.845793" } 34886 1727204490.95076: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204490.95079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204490.95082: variable 'omit' from source: magic vars 34886 1727204490.95103: variable 'ansible_distribution_major_version' from source: facts 34886 1727204490.95116: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204490.95622: variable 'type' from source: play vars 34886 1727204490.95634: variable 'state' from source: include params 34886 1727204490.95643: variable 'interface' from source: play vars 34886 1727204490.95652: variable 'current_interfaces' from source: set_fact 34886 1727204490.95664: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 34886 1727204490.95675: variable 'omit' from source: magic vars 34886 1727204490.95702: variable 'omit' from source: magic vars 34886 1727204490.95761: variable 'item' from source: unknown 34886 1727204490.95850: variable 'item' from source: unknown 34886 1727204490.95872: variable 'omit' from source: magic vars 34886 1727204490.95907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204490.95927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204490.95943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204490.95969: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204490.95978: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204490.95986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204490.96100: Set connection var ansible_timeout to 10 34886 1727204490.96113: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204490.96123: Set connection var ansible_connection to ssh 34886 1727204490.96135: Set connection var ansible_shell_executable to /bin/sh 34886 1727204490.96150: Set connection var ansible_pipelining to False 34886 1727204490.96158: Set connection var ansible_shell_type to sh 34886 1727204490.96191: variable 'ansible_shell_executable' from source: unknown 34886 1727204490.96201: variable 
'ansible_connection' from source: unknown 34886 1727204490.96208: variable 'ansible_module_compression' from source: unknown 34886 1727204490.96215: variable 'ansible_shell_type' from source: unknown 34886 1727204490.96225: variable 'ansible_shell_executable' from source: unknown 34886 1727204490.96296: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204490.96299: variable 'ansible_pipelining' from source: unknown 34886 1727204490.96301: variable 'ansible_timeout' from source: unknown 34886 1727204490.96304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204490.96379: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204490.96398: variable 'omit' from source: magic vars 34886 1727204490.96409: starting attempt loop 34886 1727204490.96416: running the handler 34886 1727204490.96433: _low_level_execute_command(): starting 34886 1727204490.96443: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204490.97086: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204490.97107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204490.97127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204490.97149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204490.97267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 34886 1727204490.97286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204490.97309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204490.97391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204490.99067: stdout chunk (state=3): >>>/root <<< 34886 1727204490.99253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204490.99264: stdout chunk (state=3): >>><<< 34886 1727204490.99285: stderr chunk (state=3): >>><<< 34886 1727204490.99312: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204490.99331: _low_level_execute_command(): starting 34886 1727204490.99343: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546 `" && echo ansible-tmp-1727204490.9931958-35658-227475194732546="` echo /root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546 `" ) && sleep 0' 34886 1727204491.00024: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204491.00039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204491.00153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204491.00184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.00254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.02242: stdout chunk (state=3): >>>ansible-tmp-1727204490.9931958-35658-227475194732546=/root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546 <<< 34886 1727204491.02455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.02467: stderr chunk (state=3): >>><<< 34886 1727204491.02471: stdout chunk (state=3): >>><<< 34886 1727204491.02501: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204490.9931958-35658-227475194732546=/root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204491.02518: variable 'ansible_module_compression' from source: unknown 34886 1727204491.02561: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204491.02571: variable 'ansible_facts' from source: unknown 34886 1727204491.02621: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546/AnsiballZ_command.py 34886 1727204491.02741: Sending initial data 34886 1727204491.02744: Sent initial data (156 bytes) 34886 1727204491.03224: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.03228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204491.03231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204491.03233: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204491.03235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.03284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204491.03287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.03327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.04942: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server 
supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204491.05002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204491.05047: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpjae932nk /root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546/AnsiballZ_command.py <<< 34886 1727204491.05051: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546/AnsiballZ_command.py" <<< 34886 1727204491.05098: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpjae932nk" to remote "/root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546/AnsiballZ_command.py" <<< 34886 1727204491.06297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.06301: stdout chunk (state=3): >>><<< 34886 1727204491.06304: stderr chunk (state=3): >>><<< 34886 1727204491.06306: done transferring module to remote 34886 1727204491.06308: _low_level_execute_command(): starting 34886 1727204491.06311: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546/ /root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546/AnsiballZ_command.py && sleep 0' 34886 1727204491.06741: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.06758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204491.06781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.06830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204491.06833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.06875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.08839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.08843: stdout chunk (state=3): >>><<< 34886 1727204491.08845: stderr chunk (state=3): >>><<< 34886 1727204491.08951: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204491.08957: _low_level_execute_command(): starting 34886 1727204491.08960: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546/AnsiballZ_command.py && sleep 0' 34886 1727204491.09455: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.09469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.09517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204491.09531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.09581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.27582: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-24 15:01:31.270896", "end": "2024-09-24 15:01:31.274576", "delta": "0:00:00.003680", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204491.29232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204491.29301: stderr chunk (state=3): >>><<< 34886 1727204491.29305: stdout chunk (state=3): >>><<< 34886 1727204491.29324: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-24 15:01:31.270896", "end": "2024-09-24 15:01:31.274576", "delta": "0:00:00.003680", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
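After this second item the veth pair exists and its peer end has been set administratively up; veth0 itself has not yet been brought up at this point in the excerpt. The checks below are illustrative only and were not executed as part of the logged run; they assume both ends of the pair still live in the default network namespace on managed-node3:

    # Not part of the logged run; illustrative state checks only.
    ip link show veth0       # name should render as veth0@peerveth0 while both ends share a namespace
    ip link show peerveth0   # its flag set should now include UP, even though veth0 is still down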
34886 1727204491.29355: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204491.29361: _low_level_execute_command(): starting 34886 1727204491.29367: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204490.9931958-35658-227475194732546/ > /dev/null 2>&1 && sleep 0' 34886 1727204491.29868: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204491.29871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204491.29874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204491.29876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204491.29879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.29933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204491.29940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.29979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.31872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.31922: stderr chunk (state=3): >>><<< 34886 1727204491.31926: stdout chunk (state=3): >>><<< 34886 1727204491.31941: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204491.31948: handler run complete 34886 1727204491.31970: Evaluated conditional (False): False 34886 1727204491.31981: attempt loop complete, returning result 34886 1727204491.32000: variable 'item' from source: unknown 34886 1727204491.32074: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003680", "end": "2024-09-24 15:01:31.274576", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-24 15:01:31.270896" } 34886 1727204491.32205: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204491.32208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204491.32211: variable 'omit' from source: magic vars 34886 1727204491.32348: variable 'ansible_distribution_major_version' from source: facts 34886 1727204491.32355: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204491.32515: variable 'type' from source: play vars 34886 1727204491.32522: variable 'state' from source: include params 34886 1727204491.32525: variable 'interface' from source: play vars 34886 1727204491.32530: variable 'current_interfaces' from source: set_fact 34886 1727204491.32537: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 34886 1727204491.32548: variable 'omit' from source: magic vars 34886 1727204491.32560: variable 'omit' from source: magic vars 34886 1727204491.32596: variable 'item' from source: unknown 34886 1727204491.32650: variable 'item' from source: unknown 34886 1727204491.32666: variable 'omit' from source: magic vars 34886 1727204491.32685: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204491.32694: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204491.32702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204491.32714: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204491.32717: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204491.32724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204491.32791: Set connection var ansible_timeout to 10 34886 1727204491.32797: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204491.32800: Set connection var ansible_connection to ssh 34886 1727204491.32807: Set connection var ansible_shell_executable to /bin/sh 34886 1727204491.32815: Set connection var ansible_pipelining to False 34886 1727204491.32818: Set connection var ansible_shell_type to sh 34886 1727204491.32839: variable 'ansible_shell_executable' from source: unknown 34886 1727204491.32842: variable 
'ansible_connection' from source: unknown 34886 1727204491.32845: variable 'ansible_module_compression' from source: unknown 34886 1727204491.32849: variable 'ansible_shell_type' from source: unknown 34886 1727204491.32852: variable 'ansible_shell_executable' from source: unknown 34886 1727204491.32857: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204491.32862: variable 'ansible_pipelining' from source: unknown 34886 1727204491.32865: variable 'ansible_timeout' from source: unknown 34886 1727204491.32877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204491.32951: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204491.32959: variable 'omit' from source: magic vars 34886 1727204491.32965: starting attempt loop 34886 1727204491.32968: running the handler 34886 1727204491.32975: _low_level_execute_command(): starting 34886 1727204491.32983: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204491.33468: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.33472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.33475: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204491.33477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204491.33479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.33528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204491.33532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.33577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.35247: stdout chunk (state=3): >>>/root <<< 34886 1727204491.35354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.35407: stderr chunk (state=3): >>><<< 34886 1727204491.35410: stdout chunk (state=3): >>><<< 34886 1727204491.35427: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204491.35436: _low_level_execute_command(): starting 34886 1727204491.35441: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416 `" && echo ansible-tmp-1727204491.3542662-35658-88905286701416="` echo /root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416 `" ) && sleep 0' 34886 1727204491.35894: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204491.35898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204491.35901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204491.35903: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204491.35905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.35948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204491.35973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.36005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.37970: stdout chunk (state=3): >>>ansible-tmp-1727204491.3542662-35658-88905286701416=/root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416 <<< 34886 1727204491.38081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.38138: stderr chunk (state=3): >>><<< 34886 1727204491.38141: stdout chunk (state=3): >>><<< 34886 1727204491.38157: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204491.3542662-35658-88905286701416=/root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204491.38182: variable 'ansible_module_compression' from source: unknown 34886 1727204491.38218: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204491.38233: variable 'ansible_facts' from source: unknown 34886 1727204491.38280: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416/AnsiballZ_command.py 34886 1727204491.38382: Sending initial data 34886 1727204491.38386: Sent initial data (155 bytes) 34886 1727204491.38855: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204491.38859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204491.38861: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204491.38869: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204491.38872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.38921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204491.38925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.38965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.40572: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204491.40576: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204491.40609: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204491.40640: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpsdojc_an /root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416/AnsiballZ_command.py <<< 34886 1727204491.40644: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416/AnsiballZ_command.py" <<< 34886 1727204491.40669: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpsdojc_an" to remote "/root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416/AnsiballZ_command.py" <<< 34886 1727204491.41427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.41492: stderr chunk (state=3): >>><<< 34886 1727204491.41496: stdout chunk (state=3): >>><<< 34886 1727204491.41513: done transferring module to remote 34886 1727204491.41524: _low_level_execute_command(): starting 34886 1727204491.41527: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416/ /root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416/AnsiballZ_command.py && sleep 0' 34886 1727204491.41996: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204491.42000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204491.42002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204491.42005: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204491.42007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.42069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204491.42071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.42106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.44011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.44056: stderr chunk (state=3): >>><<< 34886 1727204491.44060: stdout chunk (state=3): >>><<< 34886 1727204491.44075: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204491.44078: _low_level_execute_command(): starting 34886 1727204491.44084: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416/AnsiballZ_command.py && sleep 0' 34886 1727204491.44547: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.44550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.44553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204491.44557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.44605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204491.44612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.44661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.62833: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-24 15:01:31.622912", "end": "2024-09-24 15:01:31.626944", "delta": "0:00:00.004032", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204491.64649: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204491.64707: stderr chunk (state=3): >>><<< 34886 1727204491.64711: stdout chunk (state=3): >>><<< 34886 1727204491.64732: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-24 15:01:31.622912", "end": "2024-09-24 15:01:31.626944", "delta": "0:00:00.004032", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
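The JSON blob above is the command module's result for "ip link set veth0 up", produced on the remote host by the transferred AnsiballZ_command.py and read back over the shared SSH connection. As a rough orientation only, the sketch below mirrors the shape of that result (rc, stdout, stderr, start/end/delta); it is not Ansible's actual AnsiballZ payload, which bundles the full module source and its arguments.

    # Minimal sketch, NOT Ansible's AnsiballZ payload: run an argv without a
    # shell and emit a JSON result shaped like the one in the log above.
    import json
    import subprocess
    from datetime import datetime

    def run_command(argv):
        start = datetime.now()
        proc = subprocess.run(argv, capture_output=True, text=True)
        end = datetime.now()
        return {
            "changed": True,
            "stdout": proc.stdout.rstrip("\n"),
            "stderr": proc.stderr.rstrip("\n"),
            "rc": proc.returncode,
            "cmd": argv,
            "start": str(start),
            "end": str(end),
            "delta": str(end - start),
        }

    if __name__ == "__main__":
        print(json.dumps(run_command(["ip", "link", "set", "veth0", "up"])))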
34886 1727204491.64760: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204491.64766: _low_level_execute_command(): starting 34886 1727204491.64772: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204491.3542662-35658-88905286701416/ > /dev/null 2>&1 && sleep 0' 34886 1727204491.65261: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.65265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204491.65269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.65272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.65274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.65324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204491.65328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.65377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.67327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.67372: stderr chunk (state=3): >>><<< 34886 1727204491.67376: stdout chunk (state=3): >>><<< 34886 1727204491.67392: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204491.67398: handler run complete 34886 1727204491.67418: Evaluated conditional (False): False 34886 1727204491.67434: attempt loop complete, returning result 34886 1727204491.67453: variable 'item' from source: unknown 34886 1727204491.67523: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.004032", "end": "2024-09-24 15:01:31.626944", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-24 15:01:31.622912" } 34886 1727204491.67659: dumping result to json 34886 1727204491.67662: done dumping result, returning 34886 1727204491.67665: done running TaskExecutor() for managed-node3/TASK: Create veth interface veth0 [12b410aa-8751-04b9-2e74-00000000015a] 34886 1727204491.67668: sending task result for task 12b410aa-8751-04b9-2e74-00000000015a 34886 1727204491.67819: no more pending results, returning what we have 34886 1727204491.67823: results queue empty 34886 1727204491.67824: checking for any_errors_fatal 34886 1727204491.67830: done checking for any_errors_fatal 34886 1727204491.67831: checking for max_fail_percentage 34886 1727204491.67832: done checking for max_fail_percentage 34886 1727204491.67833: checking to see if all hosts have failed and the running result is not ok 34886 1727204491.67834: done checking to see if all hosts have failed 34886 1727204491.67835: getting the remaining hosts for this loop 34886 1727204491.67836: done getting the remaining hosts for this loop 34886 1727204491.67841: getting the next task for host managed-node3 34886 1727204491.67846: done getting next task for host managed-node3 34886 1727204491.67849: ^ task is: TASK: Set up veth as managed by NetworkManager 34886 1727204491.67852: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204491.67860: done sending task result for task 12b410aa-8751-04b9-2e74-00000000015a 34886 1727204491.67871: WORKER PROCESS EXITING 34886 1727204491.67867: getting variables 34886 1727204491.67874: in VariableManager get_vars() 34886 1727204491.67916: Calling all_inventory to load vars for managed-node3 34886 1727204491.67919: Calling groups_inventory to load vars for managed-node3 34886 1727204491.67923: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204491.67933: Calling all_plugins_play to load vars for managed-node3 34886 1727204491.67936: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204491.67940: Calling groups_plugins_play to load vars for managed-node3 34886 1727204491.68105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204491.68294: done with get_vars() 34886 1727204491.68304: done getting variables 34886 1727204491.68355: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:01:31 -0400 (0:00:01.193) 0:00:09.851 ***** 34886 1727204491.68378: entering _queue_task() for managed-node3/command 34886 1727204491.68594: worker is 1 (out of 1 available) 34886 1727204491.68608: exiting _queue_task() for managed-node3/command 34886 1727204491.68620: done queuing things up, now waiting for results queue to drain 34886 1727204491.68622: waiting for pending results... 
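For each command task the log repeats the same low-level sequence over the multiplexed SSH connection (auto-mux, master session id 2): 'echo ~' to resolve the remote home, mkdir of a per-task temp dir under /root/.ansible/tmp, an sftp put of AnsiballZ_command.py, chmod u+x, execution with /usr/bin/python3.12, and finally 'rm -f -r' of the temp dir. The sketch below reproduces that sequence with the plain OpenSSH client tools; it is an illustration, not Ansible's connection plugin, and HOST and PAYLOAD are placeholders (scp is used here for simplicity where the log shows an sftp transfer over the shared connection).

    # Rough stand-alone reproduction of the per-task sequence seen in the log,
    # driven through the ssh/scp CLIs. HOST mirrors the target IP in the log;
    # PAYLOAD is a hypothetical local module path.
    import subprocess
    import time

    HOST = "root@10.31.10.90"
    PAYLOAD = "/tmp/AnsiballZ_command.py"   # placeholder local file
    TMP = f"/root/.ansible/tmp/ansible-tmp-sketch-{int(time.time())}"

    def ssh(cmd):
        # One remote shell command per call, like _low_level_execute_command()
        return subprocess.run(["ssh", HOST, cmd], capture_output=True, text=True)

    ssh("echo ~ && sleep 0")                                          # resolve remote home
    ssh(f'umask 77 && mkdir -p /root/.ansible/tmp && mkdir "{TMP}"')  # per-task temp dir
    subprocess.run(["scp", PAYLOAD, f"{HOST}:{TMP}/AnsiballZ_command.py"])  # transfer module
    ssh(f'chmod u+x "{TMP}" "{TMP}/AnsiballZ_command.py"')            # make it executable
    result = ssh(f'/usr/bin/python3.12 "{TMP}/AnsiballZ_command.py"') # run the module
    ssh(f'rm -f -r "{TMP}" > /dev/null 2>&1')                         # clean up temp dir
    print(result.stdout)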
34886 1727204491.68788: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 34886 1727204491.68871: in run() - task 12b410aa-8751-04b9-2e74-00000000015b 34886 1727204491.68884: variable 'ansible_search_path' from source: unknown 34886 1727204491.68891: variable 'ansible_search_path' from source: unknown 34886 1727204491.68922: calling self._execute() 34886 1727204491.68994: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204491.69002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204491.69011: variable 'omit' from source: magic vars 34886 1727204491.69349: variable 'ansible_distribution_major_version' from source: facts 34886 1727204491.69359: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204491.69491: variable 'type' from source: play vars 34886 1727204491.69495: variable 'state' from source: include params 34886 1727204491.69503: Evaluated conditional (type == 'veth' and state == 'present'): True 34886 1727204491.69513: variable 'omit' from source: magic vars 34886 1727204491.69545: variable 'omit' from source: magic vars 34886 1727204491.69632: variable 'interface' from source: play vars 34886 1727204491.69645: variable 'omit' from source: magic vars 34886 1727204491.69679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204491.69711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204491.69735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204491.69752: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204491.69762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204491.69791: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204491.69795: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204491.69800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204491.69890: Set connection var ansible_timeout to 10 34886 1727204491.69897: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204491.69900: Set connection var ansible_connection to ssh 34886 1727204491.69907: Set connection var ansible_shell_executable to /bin/sh 34886 1727204491.69915: Set connection var ansible_pipelining to False 34886 1727204491.69918: Set connection var ansible_shell_type to sh 34886 1727204491.69943: variable 'ansible_shell_executable' from source: unknown 34886 1727204491.69946: variable 'ansible_connection' from source: unknown 34886 1727204491.69949: variable 'ansible_module_compression' from source: unknown 34886 1727204491.69952: variable 'ansible_shell_type' from source: unknown 34886 1727204491.69956: variable 'ansible_shell_executable' from source: unknown 34886 1727204491.69958: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204491.69969: variable 'ansible_pipelining' from source: unknown 34886 1727204491.69972: variable 'ansible_timeout' from source: unknown 34886 1727204491.69974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204491.70094: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204491.70103: variable 'omit' from source: magic vars 34886 1727204491.70109: starting attempt loop 34886 1727204491.70112: running the handler 34886 1727204491.70130: _low_level_execute_command(): starting 34886 1727204491.70138: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204491.70657: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.70688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204491.70694: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.70698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204491.70700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.70760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204491.70768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.70809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.72567: stdout chunk (state=3): >>>/root <<< 34886 1727204491.72675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.72740: stderr chunk (state=3): >>><<< 34886 1727204491.72744: stdout chunk (state=3): >>><<< 34886 1727204491.72767: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 34886 1727204491.72781: _low_level_execute_command(): starting 34886 1727204491.72788: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933 `" && echo ansible-tmp-1727204491.7276866-35781-137295906768933="` echo /root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933 `" ) && sleep 0' 34886 1727204491.73277: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.73281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.73294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.73296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.73349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204491.73354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.73395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.75380: stdout chunk (state=3): >>>ansible-tmp-1727204491.7276866-35781-137295906768933=/root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933 <<< 34886 1727204491.75498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.75549: stderr chunk (state=3): >>><<< 34886 1727204491.75553: stdout chunk (state=3): >>><<< 34886 1727204491.75569: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204491.7276866-35781-137295906768933=/root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204491.75601: variable 'ansible_module_compression' from source: unknown 34886 1727204491.75647: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204491.75676: variable 'ansible_facts' from source: unknown 34886 1727204491.75748: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933/AnsiballZ_command.py 34886 1727204491.75867: Sending initial data 34886 1727204491.75871: Sent initial data (156 bytes) 34886 1727204491.76341: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204491.76345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204491.76347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204491.76350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.76353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.76407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204491.76411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.76450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.78046: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204491.78057: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204491.78077: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204491.78112: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpims8547_ /root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933/AnsiballZ_command.py <<< 34886 1727204491.78126: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933/AnsiballZ_command.py" <<< 34886 1727204491.78145: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpims8547_" to remote "/root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933/AnsiballZ_command.py" <<< 34886 1727204491.78913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.78977: stderr chunk (state=3): >>><<< 34886 1727204491.78980: stdout chunk (state=3): >>><<< 34886 1727204491.79001: done transferring module to remote 34886 1727204491.79011: _low_level_execute_command(): starting 34886 1727204491.79016: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933/ /root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933/AnsiballZ_command.py && sleep 0' 34886 1727204491.79470: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.79473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204491.79476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204491.79478: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204491.79484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.79541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204491.79546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204491.79548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.79581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204491.81432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204491.81479: stderr chunk (state=3): >>><<< 34886 1727204491.81483: stdout chunk (state=3): >>><<< 34886 1727204491.81500: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204491.81504: _low_level_execute_command(): starting 34886 1727204491.81509: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933/AnsiballZ_command.py && sleep 0' 34886 1727204491.81973: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204491.81976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204491.81979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204491.81983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204491.81985: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204491.82043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204491.82048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204491.82050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204491.82093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204492.01422: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-24 15:01:31.992109", "end": "2024-09-24 15:01:32.011436", "delta": "0:00:00.019327", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204492.03047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204492.03109: stderr chunk (state=3): >>><<< 34886 1727204492.03114: stdout chunk (state=3): >>><<< 34886 1727204492.03134: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-24 15:01:31.992109", "end": "2024-09-24 15:01:32.011436", "delta": "0:00:00.019327", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
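The result above shows "nmcli d set veth0 managed true" completing with rc=0 in about 19 ms, after which NetworkManager should treat veth0 as a managed device. A small verification sketch that could be run on the managed node is shown below; it is an assumption for illustration and not part of the test playbook.

    # Verification sketch (not from the playbook under test): ask NetworkManager
    # for veth0's state after the "nmcli d set veth0 managed true" task above.
    import subprocess

    state = subprocess.run(
        ["nmcli", "-g", "GENERAL.STATE", "device", "show", "veth0"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(f"NetworkManager state for veth0: {state}")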
34886 1727204492.03172: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204492.03180: _low_level_execute_command(): starting 34886 1727204492.03186: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204491.7276866-35781-137295906768933/ > /dev/null 2>&1 && sleep 0' 34886 1727204492.03682: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204492.03685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.03688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204492.03701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.03750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204492.03758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204492.03798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204492.05705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204492.05758: stderr chunk (state=3): >>><<< 34886 1727204492.05761: stdout chunk (state=3): >>><<< 34886 1727204492.05776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204492.05784: handler run complete 34886 1727204492.05810: Evaluated conditional (False): False 34886 1727204492.05825: attempt loop complete, returning result 34886 1727204492.05828: _execute() done 34886 1727204492.05831: dumping result to json 34886 1727204492.05839: done dumping result, returning 34886 1727204492.05847: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [12b410aa-8751-04b9-2e74-00000000015b] 34886 1727204492.05854: sending task result for task 12b410aa-8751-04b9-2e74-00000000015b 34886 1727204492.05966: done sending task result for task 12b410aa-8751-04b9-2e74-00000000015b 34886 1727204492.05969: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.019327", "end": "2024-09-24 15:01:32.011436", "rc": 0, "start": "2024-09-24 15:01:31.992109" } 34886 1727204492.06051: no more pending results, returning what we have 34886 1727204492.06055: results queue empty 34886 1727204492.06056: checking for any_errors_fatal 34886 1727204492.06070: done checking for any_errors_fatal 34886 1727204492.06070: checking for max_fail_percentage 34886 1727204492.06072: done checking for max_fail_percentage 34886 1727204492.06073: checking to see if all hosts have failed and the running result is not ok 34886 1727204492.06074: done checking to see if all hosts have failed 34886 1727204492.06075: getting the remaining hosts for this loop 34886 1727204492.06077: done getting the remaining hosts for this loop 34886 1727204492.06084: getting the next task for host managed-node3 34886 1727204492.06092: done getting next task for host managed-node3 34886 1727204492.06095: ^ task is: TASK: Delete veth interface {{ interface }} 34886 1727204492.06098: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204492.06101: getting variables 34886 1727204492.06103: in VariableManager get_vars() 34886 1727204492.06146: Calling all_inventory to load vars for managed-node3 34886 1727204492.06149: Calling groups_inventory to load vars for managed-node3 34886 1727204492.06151: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.06162: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.06165: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.06168: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.06370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.06559: done with get_vars() 34886 1727204492.06568: done getting variables 34886 1727204492.06618: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204492.06721: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.383) 0:00:10.235 ***** 34886 1727204492.06749: entering _queue_task() for managed-node3/command 34886 1727204492.06970: worker is 1 (out of 1 available) 34886 1727204492.06986: exiting _queue_task() for managed-node3/command 34886 1727204492.07002: done queuing things up, now waiting for results queue to drain 34886 1727204492.07004: waiting for pending results... 
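The banner "TASK [Delete veth interface veth0]" is the rendered form of the task name the internal trace records as "Delete veth interface {{ interface }}": task names are Jinja2 templates evaluated against play vars before display. A tiny illustration follows, using the jinja2 library directly rather than Ansible's own templar.

    # Illustration only: render the templated task name the way the banner does.
    from jinja2 import Template

    name = Template("Delete veth interface {{ interface }}")
    print(name.render(interface="veth0"))   # -> "Delete veth interface veth0"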
34886 1727204492.07164: running TaskExecutor() for managed-node3/TASK: Delete veth interface veth0 34886 1727204492.07240: in run() - task 12b410aa-8751-04b9-2e74-00000000015c 34886 1727204492.07253: variable 'ansible_search_path' from source: unknown 34886 1727204492.07257: variable 'ansible_search_path' from source: unknown 34886 1727204492.07288: calling self._execute() 34886 1727204492.07361: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.07368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.07378: variable 'omit' from source: magic vars 34886 1727204492.07683: variable 'ansible_distribution_major_version' from source: facts 34886 1727204492.07695: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204492.07860: variable 'type' from source: play vars 34886 1727204492.07863: variable 'state' from source: include params 34886 1727204492.07869: variable 'interface' from source: play vars 34886 1727204492.07874: variable 'current_interfaces' from source: set_fact 34886 1727204492.07888: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 34886 1727204492.07893: when evaluation is False, skipping this task 34886 1727204492.07896: _execute() done 34886 1727204492.07899: dumping result to json 34886 1727204492.07902: done dumping result, returning 34886 1727204492.07904: done running TaskExecutor() for managed-node3/TASK: Delete veth interface veth0 [12b410aa-8751-04b9-2e74-00000000015c] 34886 1727204492.07911: sending task result for task 12b410aa-8751-04b9-2e74-00000000015c 34886 1727204492.07995: done sending task result for task 12b410aa-8751-04b9-2e74-00000000015c 34886 1727204492.07998: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34886 1727204492.08055: no more pending results, returning what we have 34886 1727204492.08058: results queue empty 34886 1727204492.08059: checking for any_errors_fatal 34886 1727204492.08065: done checking for any_errors_fatal 34886 1727204492.08066: checking for max_fail_percentage 34886 1727204492.08068: done checking for max_fail_percentage 34886 1727204492.08069: checking to see if all hosts have failed and the running result is not ok 34886 1727204492.08070: done checking to see if all hosts have failed 34886 1727204492.08071: getting the remaining hosts for this loop 34886 1727204492.08073: done getting the remaining hosts for this loop 34886 1727204492.08076: getting the next task for host managed-node3 34886 1727204492.08082: done getting next task for host managed-node3 34886 1727204492.08085: ^ task is: TASK: Create dummy interface {{ interface }} 34886 1727204492.08096: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204492.08100: getting variables 34886 1727204492.08102: in VariableManager get_vars() 34886 1727204492.08142: Calling all_inventory to load vars for managed-node3 34886 1727204492.08145: Calling groups_inventory to load vars for managed-node3 34886 1727204492.08147: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.08155: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.08157: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.08159: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.08313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.08494: done with get_vars() 34886 1727204492.08503: done getting variables 34886 1727204492.08554: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204492.08644: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.019) 0:00:10.254 ***** 34886 1727204492.08668: entering _queue_task() for managed-node3/command 34886 1727204492.08868: worker is 1 (out of 1 available) 34886 1727204492.08884: exiting _queue_task() for managed-node3/command 34886 1727204492.08900: done queuing things up, now waiting for results queue to drain 34886 1727204492.08903: waiting for pending results... 
34886 1727204492.09052: running TaskExecutor() for managed-node3/TASK: Create dummy interface veth0 34886 1727204492.09129: in run() - task 12b410aa-8751-04b9-2e74-00000000015d 34886 1727204492.09142: variable 'ansible_search_path' from source: unknown 34886 1727204492.09146: variable 'ansible_search_path' from source: unknown 34886 1727204492.09175: calling self._execute() 34886 1727204492.09239: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.09255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.09258: variable 'omit' from source: magic vars 34886 1727204492.09537: variable 'ansible_distribution_major_version' from source: facts 34886 1727204492.09547: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204492.09716: variable 'type' from source: play vars 34886 1727204492.09723: variable 'state' from source: include params 34886 1727204492.09727: variable 'interface' from source: play vars 34886 1727204492.09738: variable 'current_interfaces' from source: set_fact 34886 1727204492.09746: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 34886 1727204492.09749: when evaluation is False, skipping this task 34886 1727204492.09752: _execute() done 34886 1727204492.09756: dumping result to json 34886 1727204492.09761: done dumping result, returning 34886 1727204492.09767: done running TaskExecutor() for managed-node3/TASK: Create dummy interface veth0 [12b410aa-8751-04b9-2e74-00000000015d] 34886 1727204492.09774: sending task result for task 12b410aa-8751-04b9-2e74-00000000015d 34886 1727204492.09863: done sending task result for task 12b410aa-8751-04b9-2e74-00000000015d 34886 1727204492.09867: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 34886 1727204492.09921: no more pending results, returning what we have 34886 1727204492.09924: results queue empty 34886 1727204492.09925: checking for any_errors_fatal 34886 1727204492.09930: done checking for any_errors_fatal 34886 1727204492.09931: checking for max_fail_percentage 34886 1727204492.09933: done checking for max_fail_percentage 34886 1727204492.09934: checking to see if all hosts have failed and the running result is not ok 34886 1727204492.09935: done checking to see if all hosts have failed 34886 1727204492.09936: getting the remaining hosts for this loop 34886 1727204492.09937: done getting the remaining hosts for this loop 34886 1727204492.09941: getting the next task for host managed-node3 34886 1727204492.09946: done getting next task for host managed-node3 34886 1727204492.09948: ^ task is: TASK: Delete dummy interface {{ interface }} 34886 1727204492.09952: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204492.09955: getting variables 34886 1727204492.09957: in VariableManager get_vars() 34886 1727204492.09993: Calling all_inventory to load vars for managed-node3 34886 1727204492.09996: Calling groups_inventory to load vars for managed-node3 34886 1727204492.09998: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.10008: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.10011: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.10015: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.10204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.10384: done with get_vars() 34886 1727204492.10393: done getting variables 34886 1727204492.10439: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204492.10522: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.018) 0:00:10.273 ***** 34886 1727204492.10545: entering _queue_task() for managed-node3/command 34886 1727204492.10734: worker is 1 (out of 1 available) 34886 1727204492.10750: exiting _queue_task() for managed-node3/command 34886 1727204492.10763: done queuing things up, now waiting for results queue to drain 34886 1727204492.10764: waiting for pending results... 
34886 1727204492.10905: running TaskExecutor() for managed-node3/TASK: Delete dummy interface veth0 34886 1727204492.10976: in run() - task 12b410aa-8751-04b9-2e74-00000000015e 34886 1727204492.10988: variable 'ansible_search_path' from source: unknown 34886 1727204492.10991: variable 'ansible_search_path' from source: unknown 34886 1727204492.11025: calling self._execute() 34886 1727204492.11083: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.11091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.11102: variable 'omit' from source: magic vars 34886 1727204492.11378: variable 'ansible_distribution_major_version' from source: facts 34886 1727204492.11388: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204492.11553: variable 'type' from source: play vars 34886 1727204492.11561: variable 'state' from source: include params 34886 1727204492.11564: variable 'interface' from source: play vars 34886 1727204492.11569: variable 'current_interfaces' from source: set_fact 34886 1727204492.11578: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 34886 1727204492.11581: when evaluation is False, skipping this task 34886 1727204492.11584: _execute() done 34886 1727204492.11588: dumping result to json 34886 1727204492.11593: done dumping result, returning 34886 1727204492.11601: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface veth0 [12b410aa-8751-04b9-2e74-00000000015e] 34886 1727204492.11607: sending task result for task 12b410aa-8751-04b9-2e74-00000000015e 34886 1727204492.11693: done sending task result for task 12b410aa-8751-04b9-2e74-00000000015e 34886 1727204492.11696: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34886 1727204492.11749: no more pending results, returning what we have 34886 1727204492.11752: results queue empty 34886 1727204492.11753: checking for any_errors_fatal 34886 1727204492.11757: done checking for any_errors_fatal 34886 1727204492.11758: checking for max_fail_percentage 34886 1727204492.11760: done checking for max_fail_percentage 34886 1727204492.11761: checking to see if all hosts have failed and the running result is not ok 34886 1727204492.11762: done checking to see if all hosts have failed 34886 1727204492.11763: getting the remaining hosts for this loop 34886 1727204492.11764: done getting the remaining hosts for this loop 34886 1727204492.11767: getting the next task for host managed-node3 34886 1727204492.11772: done getting next task for host managed-node3 34886 1727204492.11776: ^ task is: TASK: Create tap interface {{ interface }} 34886 1727204492.11779: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204492.11783: getting variables 34886 1727204492.11784: in VariableManager get_vars() 34886 1727204492.11824: Calling all_inventory to load vars for managed-node3 34886 1727204492.11827: Calling groups_inventory to load vars for managed-node3 34886 1727204492.11828: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.11836: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.11838: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.11841: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.11992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.12172: done with get_vars() 34886 1727204492.12180: done getting variables 34886 1727204492.12229: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204492.12311: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.017) 0:00:10.291 ***** 34886 1727204492.12336: entering _queue_task() for managed-node3/command 34886 1727204492.12522: worker is 1 (out of 1 available) 34886 1727204492.12538: exiting _queue_task() for managed-node3/command 34886 1727204492.12551: done queuing things up, now waiting for results queue to drain 34886 1727204492.12553: waiting for pending results... 
34886 1727204492.12691: running TaskExecutor() for managed-node3/TASK: Create tap interface veth0 34886 1727204492.12763: in run() - task 12b410aa-8751-04b9-2e74-00000000015f 34886 1727204492.12777: variable 'ansible_search_path' from source: unknown 34886 1727204492.12780: variable 'ansible_search_path' from source: unknown 34886 1727204492.12812: calling self._execute() 34886 1727204492.12874: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.12880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.12894: variable 'omit' from source: magic vars 34886 1727204492.13213: variable 'ansible_distribution_major_version' from source: facts 34886 1727204492.13230: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204492.13393: variable 'type' from source: play vars 34886 1727204492.13397: variable 'state' from source: include params 34886 1727204492.13403: variable 'interface' from source: play vars 34886 1727204492.13408: variable 'current_interfaces' from source: set_fact 34886 1727204492.13416: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 34886 1727204492.13421: when evaluation is False, skipping this task 34886 1727204492.13424: _execute() done 34886 1727204492.13427: dumping result to json 34886 1727204492.13429: done dumping result, returning 34886 1727204492.13436: done running TaskExecutor() for managed-node3/TASK: Create tap interface veth0 [12b410aa-8751-04b9-2e74-00000000015f] 34886 1727204492.13447: sending task result for task 12b410aa-8751-04b9-2e74-00000000015f 34886 1727204492.13529: done sending task result for task 12b410aa-8751-04b9-2e74-00000000015f 34886 1727204492.13532: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 34886 1727204492.13591: no more pending results, returning what we have 34886 1727204492.13595: results queue empty 34886 1727204492.13596: checking for any_errors_fatal 34886 1727204492.13601: done checking for any_errors_fatal 34886 1727204492.13602: checking for max_fail_percentage 34886 1727204492.13603: done checking for max_fail_percentage 34886 1727204492.13604: checking to see if all hosts have failed and the running result is not ok 34886 1727204492.13605: done checking to see if all hosts have failed 34886 1727204492.13606: getting the remaining hosts for this loop 34886 1727204492.13607: done getting the remaining hosts for this loop 34886 1727204492.13610: getting the next task for host managed-node3 34886 1727204492.13616: done getting next task for host managed-node3 34886 1727204492.13621: ^ task is: TASK: Delete tap interface {{ interface }} 34886 1727204492.13624: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204492.13628: getting variables 34886 1727204492.13629: in VariableManager get_vars() 34886 1727204492.13659: Calling all_inventory to load vars for managed-node3 34886 1727204492.13661: Calling groups_inventory to load vars for managed-node3 34886 1727204492.13663: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.13670: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.13672: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.13675: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.13853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.14033: done with get_vars() 34886 1727204492.14041: done getting variables 34886 1727204492.14086: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204492.14165: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.018) 0:00:10.309 ***** 34886 1727204492.14188: entering _queue_task() for managed-node3/command 34886 1727204492.14369: worker is 1 (out of 1 available) 34886 1727204492.14383: exiting _queue_task() for managed-node3/command 34886 1727204492.14398: done queuing things up, now waiting for results queue to drain 34886 1727204492.14400: waiting for pending results... 
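Editor's note: the variable sources reported above ('type' and 'interface' from play vars, 'state' from include params) suggest the task file is pulled in roughly as sketched below. The values are inferences, not quotes from the playbook: interface is clearly veth0 (it is rendered into the task names), and since every 'absent', 'dummy' and 'tap' branch is skipped, type is presumably 'veth' and state 'present'. The actual tests_ipv6.yml may structure this differently.

# Inferred, illustrative include -- not taken verbatim from the playbook.
- hosts: all
  vars:
    interface: veth0        # play var (rendered into the task names above)
    type: veth              # inferred: only the veth branches fit the skip pattern
  tasks:
    - name: Manage the test interface
      include_tasks: tasks/manage_test_interface.yml
      vars:
        state: present      # 'state' is reported as coming from include params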
34886 1727204492.14542: running TaskExecutor() for managed-node3/TASK: Delete tap interface veth0 34886 1727204492.14611: in run() - task 12b410aa-8751-04b9-2e74-000000000160 34886 1727204492.14631: variable 'ansible_search_path' from source: unknown 34886 1727204492.14636: variable 'ansible_search_path' from source: unknown 34886 1727204492.14659: calling self._execute() 34886 1727204492.14721: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.14725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.14736: variable 'omit' from source: magic vars 34886 1727204492.15006: variable 'ansible_distribution_major_version' from source: facts 34886 1727204492.15016: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204492.15178: variable 'type' from source: play vars 34886 1727204492.15182: variable 'state' from source: include params 34886 1727204492.15193: variable 'interface' from source: play vars 34886 1727204492.15196: variable 'current_interfaces' from source: set_fact 34886 1727204492.15203: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 34886 1727204492.15206: when evaluation is False, skipping this task 34886 1727204492.15209: _execute() done 34886 1727204492.15212: dumping result to json 34886 1727204492.15217: done dumping result, returning 34886 1727204492.15224: done running TaskExecutor() for managed-node3/TASK: Delete tap interface veth0 [12b410aa-8751-04b9-2e74-000000000160] 34886 1727204492.15230: sending task result for task 12b410aa-8751-04b9-2e74-000000000160 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34886 1727204492.15365: no more pending results, returning what we have 34886 1727204492.15369: results queue empty 34886 1727204492.15370: checking for any_errors_fatal 34886 1727204492.15374: done checking for any_errors_fatal 34886 1727204492.15375: checking for max_fail_percentage 34886 1727204492.15377: done checking for max_fail_percentage 34886 1727204492.15378: checking to see if all hosts have failed and the running result is not ok 34886 1727204492.15379: done checking to see if all hosts have failed 34886 1727204492.15380: getting the remaining hosts for this loop 34886 1727204492.15381: done getting the remaining hosts for this loop 34886 1727204492.15384: getting the next task for host managed-node3 34886 1727204492.15392: done getting next task for host managed-node3 34886 1727204492.15395: ^ task is: TASK: Set up gateway ip on veth peer 34886 1727204492.15398: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204492.15403: getting variables 34886 1727204492.15404: in VariableManager get_vars() 34886 1727204492.15440: Calling all_inventory to load vars for managed-node3 34886 1727204492.15442: Calling groups_inventory to load vars for managed-node3 34886 1727204492.15444: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.15453: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.15455: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.15458: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.15604: done sending task result for task 12b410aa-8751-04b9-2e74-000000000160 34886 1727204492.15607: WORKER PROCESS EXITING 34886 1727204492.15621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.15817: done with get_vars() 34886 1727204492.15828: done getting variables 34886 1727204492.15903: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set up gateway ip on veth peer] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:15 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.017) 0:00:10.327 ***** 34886 1727204492.15924: entering _queue_task() for managed-node3/shell 34886 1727204492.15926: Creating lock for shell 34886 1727204492.16113: worker is 1 (out of 1 available) 34886 1727204492.16132: exiting _queue_task() for managed-node3/shell 34886 1727204492.16145: done queuing things up, now waiting for results queue to drain 34886 1727204492.16146: waiting for pending results... 
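Editor's note: the connection variables the worker applies for this task below (ssh connection, sh shell at /bin/sh, pipelining off, 10 s timeout, ZIP_DEFLATED module compression) are standard defaults; nothing in the log indicates they were set explicitly. If one wanted to pin them per host, the equivalent inventory variables would look roughly like this:

# Illustrative only -- these mirror the 'Set connection var ...' lines below,
# which reflect Ansible defaults rather than explicit configuration.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false
ansible_timeout: 10
ansible_module_compression: ZIP_DEFLATED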
34886 1727204492.16285: running TaskExecutor() for managed-node3/TASK: Set up gateway ip on veth peer 34886 1727204492.16356: in run() - task 12b410aa-8751-04b9-2e74-00000000000d 34886 1727204492.16370: variable 'ansible_search_path' from source: unknown 34886 1727204492.16407: calling self._execute() 34886 1727204492.16475: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.16482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.16498: variable 'omit' from source: magic vars 34886 1727204492.16786: variable 'ansible_distribution_major_version' from source: facts 34886 1727204492.16799: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204492.16805: variable 'omit' from source: magic vars 34886 1727204492.16834: variable 'omit' from source: magic vars 34886 1727204492.16946: variable 'interface' from source: play vars 34886 1727204492.16962: variable 'omit' from source: magic vars 34886 1727204492.16999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204492.17036: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204492.17052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204492.17068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204492.17080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204492.17109: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204492.17112: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.17118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.17208: Set connection var ansible_timeout to 10 34886 1727204492.17214: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204492.17217: Set connection var ansible_connection to ssh 34886 1727204492.17227: Set connection var ansible_shell_executable to /bin/sh 34886 1727204492.17235: Set connection var ansible_pipelining to False 34886 1727204492.17238: Set connection var ansible_shell_type to sh 34886 1727204492.17261: variable 'ansible_shell_executable' from source: unknown 34886 1727204492.17268: variable 'ansible_connection' from source: unknown 34886 1727204492.17271: variable 'ansible_module_compression' from source: unknown 34886 1727204492.17274: variable 'ansible_shell_type' from source: unknown 34886 1727204492.17279: variable 'ansible_shell_executable' from source: unknown 34886 1727204492.17282: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.17287: variable 'ansible_pipelining' from source: unknown 34886 1727204492.17293: variable 'ansible_timeout' from source: unknown 34886 1727204492.17299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.17421: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204492.17430: variable 'omit' from source: magic vars 34886 
1727204492.17436: starting attempt loop 34886 1727204492.17439: running the handler 34886 1727204492.17450: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204492.17472: _low_level_execute_command(): starting 34886 1727204492.17476: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204492.18011: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204492.18015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.18017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204492.18023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.18077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204492.18080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204492.18087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204492.18128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204492.19840: stdout chunk (state=3): >>>/root <<< 34886 1727204492.19948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204492.20004: stderr chunk (state=3): >>><<< 34886 1727204492.20007: stdout chunk (state=3): >>><<< 34886 1727204492.20028: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 34886 1727204492.20039: _low_level_execute_command(): starting 34886 1727204492.20045: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454 `" && echo ansible-tmp-1727204492.2002742-35794-120921878561454="` echo /root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454 `" ) && sleep 0' 34886 1727204492.20510: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204492.20516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.20525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204492.20528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204492.20531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.20575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204492.20579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204492.20622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204492.22608: stdout chunk (state=3): >>>ansible-tmp-1727204492.2002742-35794-120921878561454=/root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454 <<< 34886 1727204492.22726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204492.22775: stderr chunk (state=3): >>><<< 34886 1727204492.22778: stdout chunk (state=3): >>><<< 34886 1727204492.22796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204492.2002742-35794-120921878561454=/root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204492.22826: variable 'ansible_module_compression' from source: unknown 34886 1727204492.22866: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204492.22896: variable 'ansible_facts' from source: unknown 34886 1727204492.22966: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454/AnsiballZ_command.py 34886 1727204492.23081: Sending initial data 34886 1727204492.23085: Sent initial data (156 bytes) 34886 1727204492.23555: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204492.23559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.23561: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204492.23564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.23622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204492.23628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204492.23664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204492.25282: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204492.25298: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204492.25321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204492.25356: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpvy86oos7 /root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454/AnsiballZ_command.py <<< 34886 1727204492.25360: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454/AnsiballZ_command.py" <<< 34886 1727204492.25387: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpvy86oos7" to remote "/root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454/AnsiballZ_command.py" <<< 34886 1727204492.26152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204492.26223: stderr chunk (state=3): >>><<< 34886 1727204492.26227: stdout chunk (state=3): >>><<< 34886 1727204492.26243: done transferring module to remote 34886 1727204492.26254: _low_level_execute_command(): starting 34886 1727204492.26259: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454/ /root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454/AnsiballZ_command.py && sleep 0' 34886 1727204492.26725: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204492.26729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.26731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204492.26733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.26793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204492.26798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204492.26841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204492.28656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204492.28708: stderr chunk (state=3): >>><<< 34886 1727204492.28711: stdout chunk (state=3): >>><<< 34886 1727204492.28729: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204492.28733: _low_level_execute_command(): starting 34886 1727204492.28739: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454/AnsiballZ_command.py && sleep 0' 34886 1727204492.29191: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204492.29195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204492.29198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204492.29200: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204492.29202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.29260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204492.29263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204492.29302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204492.49321: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-24 15:01:32.466520", "end": "2024-09-24 15:01:32.491744", "delta": "0:00:00.025224", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204492.51044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204492.51108: stderr chunk (state=3): >>><<< 34886 1727204492.51112: stdout chunk (state=3): >>><<< 34886 1727204492.51133: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-24 15:01:32.466520", "end": "2024-09-24 15:01:32.491744", "delta": "0:00:00.025224", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
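Editor's note: the module invocation above captures the full _raw_params of the 'Set up gateway ip on veth peer' task at tests_ipv6.yml:15, so the task can be reconstructed fairly directly. Note that the device name peerveth0 shown here is already templated; the real task likely derives it from {{ interface }}. A sketch based only on what the log shows:

# Reconstructed from the module_args above; options not visible in the log are omitted.
- name: Set up gateway ip on veth peer
  shell: |
    ip netns add ns1
    ip link set peerveth0 netns ns1
    ip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0
    ip netns exec ns1 ip link set peerveth0 up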
34886 1727204492.51170: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204492.51180: _low_level_execute_command(): starting 34886 1727204492.51186: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204492.2002742-35794-120921878561454/ > /dev/null 2>&1 && sleep 0' 34886 1727204492.51680: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204492.51683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204492.51694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204492.51697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204492.51699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.51750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204492.51758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204492.51792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204492.53686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204492.53744: stderr chunk (state=3): >>><<< 34886 1727204492.53749: stdout chunk (state=3): >>><<< 34886 1727204492.53765: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204492.53773: handler run complete 34886 1727204492.53796: Evaluated conditional (False): False 34886 1727204492.53809: attempt loop complete, returning result 34886 1727204492.53813: _execute() done 34886 1727204492.53815: dumping result to json 34886 1727204492.53824: done dumping result, returning 34886 1727204492.53831: done running TaskExecutor() for managed-node3/TASK: Set up gateway ip on veth peer [12b410aa-8751-04b9-2e74-00000000000d] 34886 1727204492.53837: sending task result for task 12b410aa-8751-04b9-2e74-00000000000d 34886 1727204492.53948: done sending task result for task 12b410aa-8751-04b9-2e74-00000000000d 34886 1727204492.53952: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "delta": "0:00:00.025224", "end": "2024-09-24 15:01:32.491744", "rc": 0, "start": "2024-09-24 15:01:32.466520" } 34886 1727204492.54028: no more pending results, returning what we have 34886 1727204492.54032: results queue empty 34886 1727204492.54033: checking for any_errors_fatal 34886 1727204492.54039: done checking for any_errors_fatal 34886 1727204492.54040: checking for max_fail_percentage 34886 1727204492.54042: done checking for max_fail_percentage 34886 1727204492.54042: checking to see if all hosts have failed and the running result is not ok 34886 1727204492.54044: done checking to see if all hosts have failed 34886 1727204492.54044: getting the remaining hosts for this loop 34886 1727204492.54046: done getting the remaining hosts for this loop 34886 1727204492.54051: getting the next task for host managed-node3 34886 1727204492.54057: done getting next task for host managed-node3 34886 1727204492.54062: ^ task is: TASK: TEST: I can configure an interface with static ipv6 config 34886 1727204492.54064: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204492.54067: getting variables 34886 1727204492.54069: in VariableManager get_vars() 34886 1727204492.54111: Calling all_inventory to load vars for managed-node3 34886 1727204492.54114: Calling groups_inventory to load vars for managed-node3 34886 1727204492.54117: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.54130: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.54133: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.54136: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.54332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.54514: done with get_vars() 34886 1727204492.54527: done getting variables 34886 1727204492.54575: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with static ipv6 config] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:27 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.386) 0:00:10.713 ***** 34886 1727204492.54599: entering _queue_task() for managed-node3/debug 34886 1727204492.54822: worker is 1 (out of 1 available) 34886 1727204492.54836: exiting _queue_task() for managed-node3/debug 34886 1727204492.54850: done queuing things up, now waiting for results queue to drain 34886 1727204492.54852: waiting for pending results... 34886 1727204492.55021: running TaskExecutor() for managed-node3/TASK: TEST: I can configure an interface with static ipv6 config 34886 1727204492.55087: in run() - task 12b410aa-8751-04b9-2e74-00000000000f 34886 1727204492.55105: variable 'ansible_search_path' from source: unknown 34886 1727204492.55137: calling self._execute() 34886 1727204492.55209: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.55216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.55226: variable 'omit' from source: magic vars 34886 1727204492.55526: variable 'ansible_distribution_major_version' from source: facts 34886 1727204492.55536: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204492.55542: variable 'omit' from source: magic vars 34886 1727204492.55560: variable 'omit' from source: magic vars 34886 1727204492.55591: variable 'omit' from source: magic vars 34886 1727204492.55630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204492.55660: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204492.55967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204492.55986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204492.55998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204492.56028: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204492.56031: variable 'ansible_host' from 
source: host vars for 'managed-node3' 34886 1727204492.56036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.56126: Set connection var ansible_timeout to 10 34886 1727204492.56129: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204492.56132: Set connection var ansible_connection to ssh 34886 1727204492.56139: Set connection var ansible_shell_executable to /bin/sh 34886 1727204492.56147: Set connection var ansible_pipelining to False 34886 1727204492.56150: Set connection var ansible_shell_type to sh 34886 1727204492.56175: variable 'ansible_shell_executable' from source: unknown 34886 1727204492.56180: variable 'ansible_connection' from source: unknown 34886 1727204492.56183: variable 'ansible_module_compression' from source: unknown 34886 1727204492.56186: variable 'ansible_shell_type' from source: unknown 34886 1727204492.56188: variable 'ansible_shell_executable' from source: unknown 34886 1727204492.56196: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.56198: variable 'ansible_pipelining' from source: unknown 34886 1727204492.56201: variable 'ansible_timeout' from source: unknown 34886 1727204492.56203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.56324: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204492.56332: variable 'omit' from source: magic vars 34886 1727204492.56338: starting attempt loop 34886 1727204492.56341: running the handler 34886 1727204492.56381: handler run complete 34886 1727204492.56402: attempt loop complete, returning result 34886 1727204492.56405: _execute() done 34886 1727204492.56409: dumping result to json 34886 1727204492.56412: done dumping result, returning 34886 1727204492.56423: done running TaskExecutor() for managed-node3/TASK: TEST: I can configure an interface with static ipv6 config [12b410aa-8751-04b9-2e74-00000000000f] 34886 1727204492.56426: sending task result for task 12b410aa-8751-04b9-2e74-00000000000f 34886 1727204492.56514: done sending task result for task 12b410aa-8751-04b9-2e74-00000000000f 34886 1727204492.56517: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: ################################################## 34886 1727204492.56568: no more pending results, returning what we have 34886 1727204492.56571: results queue empty 34886 1727204492.56572: checking for any_errors_fatal 34886 1727204492.56578: done checking for any_errors_fatal 34886 1727204492.56579: checking for max_fail_percentage 34886 1727204492.56581: done checking for max_fail_percentage 34886 1727204492.56582: checking to see if all hosts have failed and the running result is not ok 34886 1727204492.56583: done checking to see if all hosts have failed 34886 1727204492.56584: getting the remaining hosts for this loop 34886 1727204492.56585: done getting the remaining hosts for this loop 34886 1727204492.56592: getting the next task for host managed-node3 34886 1727204492.56598: done getting next task for host managed-node3 34886 1727204492.56603: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34886 1727204492.56606: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204492.56623: getting variables 34886 1727204492.56625: in VariableManager get_vars() 34886 1727204492.56664: Calling all_inventory to load vars for managed-node3 34886 1727204492.56667: Calling groups_inventory to load vars for managed-node3 34886 1727204492.56670: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.56683: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.56685: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.56688: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.57047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.57232: done with get_vars() 34886 1727204492.57240: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.027) 0:00:10.740 ***** 34886 1727204492.57312: entering _queue_task() for managed-node3/include_tasks 34886 1727204492.57511: worker is 1 (out of 1 available) 34886 1727204492.57528: exiting _queue_task() for managed-node3/include_tasks 34886 1727204492.57541: done queuing things up, now waiting for results queue to drain 34886 1727204492.57543: waiting for pending results... 
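
For orientation, the two tasks traced just above are straightforward to reconstruct from the log itself: tests_ipv6.yml:27 is a debug banner (its msg is echoed verbatim in the ok result above), and roles/network/tasks/main.yml:4 is an include_tasks that pulls in set_facts.yml, which the trace loads next. A minimal sketch under those assumptions (anything beyond the task names, actions, and paths shown in the trace is inferred, not confirmed):

# tests/network/playbooks/tests_ipv6.yml:27 -- the banner whose ok result appears above
- name: "TEST: I can configure an interface with static ipv6 config"
  debug:
    msg: "##################################################"   # string echoed in the MSG of the result above

# roles/network/tasks/main.yml:4 -- brings in set_facts.yml, which the trace loads next.
# The ansible_distribution_major_version != '6' conditional evaluated here is checked for
# every task in this trace, so it presumably comes from an enclosing block, not this task.
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
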
34886 1727204492.57704: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34886 1727204492.57806: in run() - task 12b410aa-8751-04b9-2e74-000000000017 34886 1727204492.57818: variable 'ansible_search_path' from source: unknown 34886 1727204492.57825: variable 'ansible_search_path' from source: unknown 34886 1727204492.57851: calling self._execute() 34886 1727204492.58095: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.58099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.58102: variable 'omit' from source: magic vars 34886 1727204492.58357: variable 'ansible_distribution_major_version' from source: facts 34886 1727204492.58377: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204492.58391: _execute() done 34886 1727204492.58403: dumping result to json 34886 1727204492.58412: done dumping result, returning 34886 1727204492.58426: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-04b9-2e74-000000000017] 34886 1727204492.58438: sending task result for task 12b410aa-8751-04b9-2e74-000000000017 34886 1727204492.58595: no more pending results, returning what we have 34886 1727204492.58600: in VariableManager get_vars() 34886 1727204492.58660: Calling all_inventory to load vars for managed-node3 34886 1727204492.58664: Calling groups_inventory to load vars for managed-node3 34886 1727204492.58666: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.58680: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.58684: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.58692: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.58930: done sending task result for task 12b410aa-8751-04b9-2e74-000000000017 34886 1727204492.58934: WORKER PROCESS EXITING 34886 1727204492.58961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.59278: done with get_vars() 34886 1727204492.59288: variable 'ansible_search_path' from source: unknown 34886 1727204492.59292: variable 'ansible_search_path' from source: unknown 34886 1727204492.59340: we have included files to process 34886 1727204492.59342: generating all_blocks data 34886 1727204492.59344: done generating all_blocks data 34886 1727204492.59350: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34886 1727204492.59352: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34886 1727204492.59355: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34886 1727204492.60262: done processing included file 34886 1727204492.60264: iterating over new_blocks loaded from include file 34886 1727204492.60266: in VariableManager get_vars() 34886 1727204492.60301: done with get_vars() 34886 1727204492.60303: filtering new block on tags 34886 1727204492.60325: done filtering new block on tags 34886 1727204492.60329: in VariableManager get_vars() 34886 1727204492.60361: done with get_vars() 34886 1727204492.60364: filtering new block on tags 34886 1727204492.60394: done filtering new block on tags 34886 1727204492.60397: in 
VariableManager get_vars() 34886 1727204492.60427: done with get_vars() 34886 1727204492.60430: filtering new block on tags 34886 1727204492.60455: done filtering new block on tags 34886 1727204492.60457: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 34886 1727204492.60463: extending task lists for all hosts with included blocks 34886 1727204492.61597: done extending task lists 34886 1727204492.61599: done processing included files 34886 1727204492.61600: results queue empty 34886 1727204492.61601: checking for any_errors_fatal 34886 1727204492.61604: done checking for any_errors_fatal 34886 1727204492.61606: checking for max_fail_percentage 34886 1727204492.61607: done checking for max_fail_percentage 34886 1727204492.61608: checking to see if all hosts have failed and the running result is not ok 34886 1727204492.61609: done checking to see if all hosts have failed 34886 1727204492.61610: getting the remaining hosts for this loop 34886 1727204492.61611: done getting the remaining hosts for this loop 34886 1727204492.61615: getting the next task for host managed-node3 34886 1727204492.61619: done getting next task for host managed-node3 34886 1727204492.61623: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34886 1727204492.61627: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204492.61638: getting variables 34886 1727204492.61639: in VariableManager get_vars() 34886 1727204492.61659: Calling all_inventory to load vars for managed-node3 34886 1727204492.61661: Calling groups_inventory to load vars for managed-node3 34886 1727204492.61664: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.61670: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.61674: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.61678: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.61960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.62294: done with get_vars() 34886 1727204492.62306: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.050) 0:00:10.791 ***** 34886 1727204492.62401: entering _queue_task() for managed-node3/setup 34886 1727204492.62698: worker is 1 (out of 1 available) 34886 1727204492.62711: exiting _queue_task() for managed-node3/setup 34886 1727204492.62724: done queuing things up, now waiting for results queue to drain 34886 1727204492.62726: waiting for pending results... 34886 1727204492.63022: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34886 1727204492.63172: in run() - task 12b410aa-8751-04b9-2e74-0000000001fc 34886 1727204492.63187: variable 'ansible_search_path' from source: unknown 34886 1727204492.63193: variable 'ansible_search_path' from source: unknown 34886 1727204492.63227: calling self._execute() 34886 1727204492.63295: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.63301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.63315: variable 'omit' from source: magic vars 34886 1727204492.63617: variable 'ansible_distribution_major_version' from source: facts 34886 1727204492.63628: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204492.63815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204492.65485: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204492.65543: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204492.65574: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204492.65608: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204492.65632: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204492.65695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204492.65734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 34886 1727204492.65756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204492.65788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204492.65802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204492.65853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204492.65872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204492.65897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204492.65930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204492.65949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204492.66069: variable '__network_required_facts' from source: role '' defaults 34886 1727204492.66076: variable 'ansible_facts' from source: unknown 34886 1727204492.66153: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 34886 1727204492.66157: when evaluation is False, skipping this task 34886 1727204492.66162: _execute() done 34886 1727204492.66165: dumping result to json 34886 1727204492.66167: done dumping result, returning 34886 1727204492.66179: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-04b9-2e74-0000000001fc] 34886 1727204492.66182: sending task result for task 12b410aa-8751-04b9-2e74-0000000001fc 34886 1727204492.66264: done sending task result for task 12b410aa-8751-04b9-2e74-0000000001fc 34886 1727204492.66266: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34886 1727204492.66331: no more pending results, returning what we have 34886 1727204492.66335: results queue empty 34886 1727204492.66336: checking for any_errors_fatal 34886 1727204492.66337: done checking for any_errors_fatal 34886 1727204492.66338: checking for max_fail_percentage 34886 1727204492.66340: done checking for max_fail_percentage 34886 1727204492.66340: checking to see if all hosts have failed and the running result is not ok 34886 1727204492.66341: done checking to see if all hosts have failed 34886 1727204492.66342: getting the remaining hosts for this loop 34886 1727204492.66344: done getting the remaining hosts for 
this loop 34886 1727204492.66348: getting the next task for host managed-node3 34886 1727204492.66357: done getting next task for host managed-node3 34886 1727204492.66362: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 34886 1727204492.66367: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204492.66381: getting variables 34886 1727204492.66382: in VariableManager get_vars() 34886 1727204492.66433: Calling all_inventory to load vars for managed-node3 34886 1727204492.66437: Calling groups_inventory to load vars for managed-node3 34886 1727204492.66439: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.66449: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.66452: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.66456: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.66621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.66831: done with get_vars() 34886 1727204492.66842: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.045) 0:00:10.836 ***** 34886 1727204492.66922: entering _queue_task() for managed-node3/stat 34886 1727204492.67126: worker is 1 (out of 1 available) 34886 1727204492.67142: exiting _queue_task() for managed-node3/stat 34886 1727204492.67155: done queuing things up, now waiting for results queue to drain 34886 1727204492.67157: waiting for pending results... 
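
The two set_facts.yml entries traced around this point can likewise be sketched from what the log exposes: the setup task at set_facts.yml:3 was skipped because its when-condition (evaluated above) is False and its output is censored by no_log, and the stat-based ostree probe at set_facts.yml:12, queued just above, is about to be skipped because __network_is_ostree is already set. A rough reconstruction, with the pieces the log does not reveal marked as assumptions:

# set_facts.yml:3 -- only gathers facts when something in __network_required_facts is
# missing from ansible_facts; the when-expression is copied from the trace, but the
# setup arguments are NOT visible (no_log: true censors the result), so the argument
# shown is purely illustrative
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min            # assumed argument, not visible in this log
  no_log: true
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

# set_facts.yml:12 -- an ostree probe guarded so it only runs once; only the task name,
# the stat action and the when-condition appear in the trace
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted      # assumed probe location
  register: __ostree_booted_stat  # hypothetical register name
  when: not __network_is_ostree is defined
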
34886 1727204492.67324: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 34886 1727204492.67443: in run() - task 12b410aa-8751-04b9-2e74-0000000001fe 34886 1727204492.67455: variable 'ansible_search_path' from source: unknown 34886 1727204492.67459: variable 'ansible_search_path' from source: unknown 34886 1727204492.67493: calling self._execute() 34886 1727204492.67562: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.67568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.67578: variable 'omit' from source: magic vars 34886 1727204492.67895: variable 'ansible_distribution_major_version' from source: facts 34886 1727204492.67906: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204492.68053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204492.68358: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204492.68415: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204492.68470: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204492.68517: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204492.68635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204492.68672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204492.68729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204492.68769: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204492.68909: variable '__network_is_ostree' from source: set_fact 34886 1727204492.68912: Evaluated conditional (not __network_is_ostree is defined): False 34886 1727204492.68994: when evaluation is False, skipping this task 34886 1727204492.68998: _execute() done 34886 1727204492.69001: dumping result to json 34886 1727204492.69004: done dumping result, returning 34886 1727204492.69007: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-04b9-2e74-0000000001fe] 34886 1727204492.69009: sending task result for task 12b410aa-8751-04b9-2e74-0000000001fe 34886 1727204492.69083: done sending task result for task 12b410aa-8751-04b9-2e74-0000000001fe skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34886 1727204492.69340: no more pending results, returning what we have 34886 1727204492.69344: results queue empty 34886 1727204492.69345: checking for any_errors_fatal 34886 1727204492.69351: done checking for any_errors_fatal 34886 1727204492.69352: checking for max_fail_percentage 34886 1727204492.69353: done 
checking for max_fail_percentage 34886 1727204492.69354: checking to see if all hosts have failed and the running result is not ok 34886 1727204492.69355: done checking to see if all hosts have failed 34886 1727204492.69357: getting the remaining hosts for this loop 34886 1727204492.69358: done getting the remaining hosts for this loop 34886 1727204492.69362: getting the next task for host managed-node3 34886 1727204492.69368: done getting next task for host managed-node3 34886 1727204492.69372: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34886 1727204492.69376: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204492.69394: getting variables 34886 1727204492.69396: in VariableManager get_vars() 34886 1727204492.69441: Calling all_inventory to load vars for managed-node3 34886 1727204492.69446: Calling groups_inventory to load vars for managed-node3 34886 1727204492.69449: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.69458: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.69462: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.69466: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.69637: WORKER PROCESS EXITING 34886 1727204492.69652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.69856: done with get_vars() 34886 1727204492.69865: done getting variables 34886 1727204492.69911: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.030) 0:00:10.867 ***** 34886 1727204492.69943: entering _queue_task() for managed-node3/set_fact 34886 1727204492.70143: worker is 1 (out of 1 available) 34886 1727204492.70156: exiting _queue_task() for managed-node3/set_fact 34886 1727204492.70169: done queuing things up, now waiting for results queue to drain 34886 1727204492.70171: waiting for pending results... 
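
The set_fact task at set_facts.yml:17, queued just above, pairs with that probe: it would record the ostree flag, and it is skipped here for the same reason (the flag is already defined). Only the task name, the set_fact action and the when-condition are visible in the trace; the value expression below is an assumption reusing the hypothetical register from the earlier sketch:

# set_facts.yml:17 -- skipped in this run because __network_is_ostree is already defined
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | default(false) }}"  # assumed expression
  when: not __network_is_ostree is defined
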
34886 1727204492.70413: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34886 1727204492.70447: in run() - task 12b410aa-8751-04b9-2e74-0000000001ff 34886 1727204492.70461: variable 'ansible_search_path' from source: unknown 34886 1727204492.70465: variable 'ansible_search_path' from source: unknown 34886 1727204492.70497: calling self._execute() 34886 1727204492.70566: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.70573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.70584: variable 'omit' from source: magic vars 34886 1727204492.70888: variable 'ansible_distribution_major_version' from source: facts 34886 1727204492.70900: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204492.71041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204492.71312: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204492.71349: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204492.71382: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204492.71411: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204492.71537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204492.71575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204492.71622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204492.71694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204492.71763: variable '__network_is_ostree' from source: set_fact 34886 1727204492.71777: Evaluated conditional (not __network_is_ostree is defined): False 34886 1727204492.71786: when evaluation is False, skipping this task 34886 1727204492.71797: _execute() done 34886 1727204492.71807: dumping result to json 34886 1727204492.71825: done dumping result, returning 34886 1727204492.71895: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-04b9-2e74-0000000001ff] 34886 1727204492.71898: sending task result for task 12b410aa-8751-04b9-2e74-0000000001ff 34886 1727204492.71968: done sending task result for task 12b410aa-8751-04b9-2e74-0000000001ff 34886 1727204492.71972: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34886 1727204492.72026: no more pending results, returning what we have 34886 1727204492.72031: results queue empty 34886 1727204492.72032: checking for any_errors_fatal 34886 1727204492.72037: done checking for any_errors_fatal 34886 
1727204492.72038: checking for max_fail_percentage 34886 1727204492.72040: done checking for max_fail_percentage 34886 1727204492.72040: checking to see if all hosts have failed and the running result is not ok 34886 1727204492.72042: done checking to see if all hosts have failed 34886 1727204492.72042: getting the remaining hosts for this loop 34886 1727204492.72044: done getting the remaining hosts for this loop 34886 1727204492.72048: getting the next task for host managed-node3 34886 1727204492.72057: done getting next task for host managed-node3 34886 1727204492.72061: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 34886 1727204492.72066: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204492.72080: getting variables 34886 1727204492.72082: in VariableManager get_vars() 34886 1727204492.72122: Calling all_inventory to load vars for managed-node3 34886 1727204492.72126: Calling groups_inventory to load vars for managed-node3 34886 1727204492.72128: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204492.72138: Calling all_plugins_play to load vars for managed-node3 34886 1727204492.72141: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204492.72144: Calling groups_plugins_play to load vars for managed-node3 34886 1727204492.72451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204492.73027: done with get_vars() 34886 1727204492.73037: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:01:32 -0400 (0:00:00.033) 0:00:10.900 ***** 34886 1727204492.73259: entering _queue_task() for managed-node3/service_facts 34886 1727204492.73262: Creating lock for service_facts 34886 1727204492.73649: worker is 1 (out of 1 available) 34886 1727204492.73660: exiting _queue_task() for managed-node3/service_facts 34886 1727204492.73672: done queuing things up, now waiting for results queue to drain 34886 1727204492.73674: waiting for pending results... 
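
The last task queued above, set_facts.yml:21, is the service_facts call whose module the trace builds and ships next (AnsiballZ_service_facts.py) and whose result is the large ansible_facts.services JSON that follows. The module takes no arguments, so the sketch below is the minimal form consistent with the trace:

# set_facts.yml:21 -- populates ansible_facts.services on the managed node; the services
# dictionary returned in the stdout chunks below (auditd.service, NetworkManager.service,
# sshd.service, ...) is this module's output
- name: Check which services are running
  service_facts:

A consumer of this fact could then test entries such as ansible_facts.services['NetworkManager.service'].state, which matches the structure of the dictionary returned in the stdout chunks below.
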
34886 1727204492.73910: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 34886 1727204492.74089: in run() - task 12b410aa-8751-04b9-2e74-000000000201 34886 1727204492.74096: variable 'ansible_search_path' from source: unknown 34886 1727204492.74100: variable 'ansible_search_path' from source: unknown 34886 1727204492.74138: calling self._execute() 34886 1727204492.74310: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.74314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.74318: variable 'omit' from source: magic vars 34886 1727204492.74735: variable 'ansible_distribution_major_version' from source: facts 34886 1727204492.74762: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204492.74777: variable 'omit' from source: magic vars 34886 1727204492.74887: variable 'omit' from source: magic vars 34886 1727204492.74939: variable 'omit' from source: magic vars 34886 1727204492.74999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204492.75046: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204492.75082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204492.75115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204492.75136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204492.75207: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204492.75210: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.75213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.75344: Set connection var ansible_timeout to 10 34886 1727204492.75358: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204492.75398: Set connection var ansible_connection to ssh 34886 1727204492.75401: Set connection var ansible_shell_executable to /bin/sh 34886 1727204492.75404: Set connection var ansible_pipelining to False 34886 1727204492.75406: Set connection var ansible_shell_type to sh 34886 1727204492.75445: variable 'ansible_shell_executable' from source: unknown 34886 1727204492.75455: variable 'ansible_connection' from source: unknown 34886 1727204492.75507: variable 'ansible_module_compression' from source: unknown 34886 1727204492.75511: variable 'ansible_shell_type' from source: unknown 34886 1727204492.75513: variable 'ansible_shell_executable' from source: unknown 34886 1727204492.75515: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204492.75518: variable 'ansible_pipelining' from source: unknown 34886 1727204492.75520: variable 'ansible_timeout' from source: unknown 34886 1727204492.75522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204492.75765: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34886 1727204492.75784: variable 'omit' from source: magic vars 34886 
1727204492.75799: starting attempt loop 34886 1727204492.75808: running the handler 34886 1727204492.75944: _low_level_execute_command(): starting 34886 1727204492.75947: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204492.76638: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204492.76653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204492.76715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.76797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204492.76820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204492.76872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204492.76910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204492.78673: stdout chunk (state=3): >>>/root <<< 34886 1727204492.78886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204492.78892: stdout chunk (state=3): >>><<< 34886 1727204492.78895: stderr chunk (state=3): >>><<< 34886 1727204492.78925: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204492.78955: _low_level_execute_command(): starting 34886 1727204492.78996: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627 `" && echo ansible-tmp-1727204492.7893283-35815-72296098575627="` 
echo /root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627 `" ) && sleep 0' 34886 1727204492.79753: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.79836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204492.79875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204492.79902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204492.79983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204492.81936: stdout chunk (state=3): >>>ansible-tmp-1727204492.7893283-35815-72296098575627=/root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627 <<< 34886 1727204492.82099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204492.82102: stdout chunk (state=3): >>><<< 34886 1727204492.82105: stderr chunk (state=3): >>><<< 34886 1727204492.82116: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204492.7893283-35815-72296098575627=/root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204492.82161: variable 'ansible_module_compression' from source: unknown 34886 1727204492.82198: ANSIBALLZ: Using lock for service_facts 34886 1727204492.82203: ANSIBALLZ: Acquiring lock 34886 1727204492.82206: ANSIBALLZ: Lock acquired: 139734992016768 34886 1727204492.82209: ANSIBALLZ: Creating module 34886 1727204492.95596: ANSIBALLZ: Writing module into 
payload 34886 1727204492.95681: ANSIBALLZ: Writing module 34886 1727204492.95699: ANSIBALLZ: Renaming module 34886 1727204492.95705: ANSIBALLZ: Done creating module 34886 1727204492.95720: variable 'ansible_facts' from source: unknown 34886 1727204492.95776: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627/AnsiballZ_service_facts.py 34886 1727204492.95892: Sending initial data 34886 1727204492.95896: Sent initial data (161 bytes) 34886 1727204492.96356: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.96395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204492.96398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204492.96446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204492.98162: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204492.98168: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204492.98199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204492.98234: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpdcuh9os2 /root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627/AnsiballZ_service_facts.py <<< 34886 1727204492.98237: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627/AnsiballZ_service_facts.py" <<< 34886 1727204492.98266: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpdcuh9os2" to remote "/root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627/AnsiballZ_service_facts.py" <<< 34886 1727204492.99064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204492.99127: stderr chunk (state=3): >>><<< 34886 1727204492.99131: stdout chunk (state=3): >>><<< 34886 1727204492.99149: done transferring module to remote 34886 1727204492.99161: _low_level_execute_command(): starting 34886 1727204492.99171: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627/ /root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627/AnsiballZ_service_facts.py && sleep 0' 34886 1727204492.99627: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204492.99630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.99633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204492.99635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204492.99693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204492.99696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204492.99743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204493.01615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204493.01661: stderr chunk (state=3): >>><<< 34886 1727204493.01664: stdout chunk (state=3): >>><<< 34886 1727204493.01683: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204493.01687: _low_level_execute_command(): starting 34886 1727204493.01690: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627/AnsiballZ_service_facts.py && sleep 0' 34886 1727204493.02147: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204493.02151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204493.02154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204493.02157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204493.02207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204493.02215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204493.02260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204494.95841: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": 
"selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "s<<< 34886 1727204494.95855: stdout chunk (state=3): >>>tate": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "<<< 34886 1727204494.95871: stdout chunk (state=3): >>>udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", <<< 34886 1727204494.95902: stdout chunk (state=3): >>>"status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "sys<<< 34886 1727204494.95910: stdout chunk (state=3): >>>temd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": 
{"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": 
{"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 34886 1727204494.97527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204494.97593: stderr chunk (state=3): >>><<< 34886 1727204494.97598: stdout chunk (state=3): >>><<< 34886 1727204494.97623: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": 
"nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": 
"systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": 
"active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 34886 1727204494.98214: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204494.98230: _low_level_execute_command(): starting 34886 1727204494.98233: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204492.7893283-35815-72296098575627/ > /dev/null 2>&1 && sleep 0' 34886 1727204494.98734: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204494.98738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204494.98741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204494.98743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204494.98745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204494.98798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204494.98803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204494.98838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204495.00721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204495.00778: stderr chunk (state=3): >>><<< 34886 1727204495.00782: stdout chunk 
(state=3): >>><<< 34886 1727204495.00799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204495.00807: handler run complete 34886 1727204495.00977: variable 'ansible_facts' from source: unknown 34886 1727204495.01114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204495.01530: variable 'ansible_facts' from source: unknown 34886 1727204495.02554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204495.02753: attempt loop complete, returning result 34886 1727204495.02759: _execute() done 34886 1727204495.02762: dumping result to json 34886 1727204495.02812: done dumping result, returning 34886 1727204495.02825: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-04b9-2e74-000000000201] 34886 1727204495.02828: sending task result for task 12b410aa-8751-04b9-2e74-000000000201 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34886 1727204495.03572: no more pending results, returning what we have 34886 1727204495.03575: results queue empty 34886 1727204495.03576: checking for any_errors_fatal 34886 1727204495.03581: done checking for any_errors_fatal 34886 1727204495.03582: checking for max_fail_percentage 34886 1727204495.03583: done checking for max_fail_percentage 34886 1727204495.03584: checking to see if all hosts have failed and the running result is not ok 34886 1727204495.03585: done checking to see if all hosts have failed 34886 1727204495.03586: getting the remaining hosts for this loop 34886 1727204495.03588: done getting the remaining hosts for this loop 34886 1727204495.03594: getting the next task for host managed-node3 34886 1727204495.03600: done getting next task for host managed-node3 34886 1727204495.03603: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 34886 1727204495.03608: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204495.03630: done sending task result for task 12b410aa-8751-04b9-2e74-000000000201 34886 1727204495.03634: WORKER PROCESS EXITING 34886 1727204495.03639: getting variables 34886 1727204495.03640: in VariableManager get_vars() 34886 1727204495.03670: Calling all_inventory to load vars for managed-node3 34886 1727204495.03672: Calling groups_inventory to load vars for managed-node3 34886 1727204495.03674: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204495.03681: Calling all_plugins_play to load vars for managed-node3 34886 1727204495.03683: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204495.03686: Calling groups_plugins_play to load vars for managed-node3 34886 1727204495.04043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204495.04500: done with get_vars() 34886 1727204495.04514: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:01:35 -0400 (0:00:02.313) 0:00:13.213 ***** 34886 1727204495.04597: entering _queue_task() for managed-node3/package_facts 34886 1727204495.04603: Creating lock for package_facts 34886 1727204495.04861: worker is 1 (out of 1 available) 34886 1727204495.04877: exiting _queue_task() for managed-node3/package_facts 34886 1727204495.04893: done queuing things up, now waiting for results queue to drain 34886 1727204495.04896: waiting for pending results... 
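The two fact-gathering tasks traced above, "Check which services are running" (service_facts, result censored by no_log) and "Check which packages are installed" (package_facts, task path roles/network/tasks/set_facts.yml:26), are ordinary fact modules called from the role. A minimal sketch of how such tasks are typically written follows; the task names and the no_log flag come from this log, while the YAML itself is an illustrative reconstruction, not the actual contents of set_facts.yml.

  # Illustrative sketch only, not the real set_facts.yml from
  # fedora.linux_system_roles.network
  - name: Check which services are running
    service_facts:
    no_log: true          # matches the "censored" result logged above

  - name: Check which packages are installed
    package_facts:
      manager: auto       # assumption; the log does not show the module arguments
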
34886 1727204495.05073: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 34886 1727204495.05193: in run() - task 12b410aa-8751-04b9-2e74-000000000202 34886 1727204495.05207: variable 'ansible_search_path' from source: unknown 34886 1727204495.05211: variable 'ansible_search_path' from source: unknown 34886 1727204495.05247: calling self._execute() 34886 1727204495.05314: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204495.05323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204495.05333: variable 'omit' from source: magic vars 34886 1727204495.05651: variable 'ansible_distribution_major_version' from source: facts 34886 1727204495.05663: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204495.05674: variable 'omit' from source: magic vars 34886 1727204495.05737: variable 'omit' from source: magic vars 34886 1727204495.05767: variable 'omit' from source: magic vars 34886 1727204495.05806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204495.05841: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204495.05859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204495.05875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204495.05891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204495.05923: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204495.05926: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204495.05929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204495.06018: Set connection var ansible_timeout to 10 34886 1727204495.06024: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204495.06027: Set connection var ansible_connection to ssh 34886 1727204495.06034: Set connection var ansible_shell_executable to /bin/sh 34886 1727204495.06043: Set connection var ansible_pipelining to False 34886 1727204495.06046: Set connection var ansible_shell_type to sh 34886 1727204495.06068: variable 'ansible_shell_executable' from source: unknown 34886 1727204495.06071: variable 'ansible_connection' from source: unknown 34886 1727204495.06075: variable 'ansible_module_compression' from source: unknown 34886 1727204495.06077: variable 'ansible_shell_type' from source: unknown 34886 1727204495.06082: variable 'ansible_shell_executable' from source: unknown 34886 1727204495.06085: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204495.06092: variable 'ansible_pipelining' from source: unknown 34886 1727204495.06096: variable 'ansible_timeout' from source: unknown 34886 1727204495.06103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204495.06270: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34886 1727204495.06281: variable 'omit' from source: magic vars 34886 
1727204495.06286: starting attempt loop 34886 1727204495.06291: running the handler 34886 1727204495.06306: _low_level_execute_command(): starting 34886 1727204495.06314: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204495.06880: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204495.06884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204495.06887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204495.06893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204495.06947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204495.06954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204495.06956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204495.06996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204495.08660: stdout chunk (state=3): >>>/root <<< 34886 1727204495.08767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204495.08826: stderr chunk (state=3): >>><<< 34886 1727204495.08829: stdout chunk (state=3): >>><<< 34886 1727204495.08852: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204495.08863: _low_level_execute_command(): starting 34886 1727204495.08869: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842 `" && echo 
ansible-tmp-1727204495.0885093-35881-234768530339842="` echo /root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842 `" ) && sleep 0' 34886 1727204495.09354: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204495.09360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204495.09362: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204495.09372: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204495.09375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204495.09425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204495.09429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204495.09471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204495.11408: stdout chunk (state=3): >>>ansible-tmp-1727204495.0885093-35881-234768530339842=/root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842 <<< 34886 1727204495.11529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204495.11585: stderr chunk (state=3): >>><<< 34886 1727204495.11591: stdout chunk (state=3): >>><<< 34886 1727204495.11608: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204495.0885093-35881-234768530339842=/root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204495.11659: variable 'ansible_module_compression' from source: unknown 34886 1727204495.11704: ANSIBALLZ: Using lock for package_facts 
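The conditional logged just above, (ansible_distribution_major_version != '6') evaluated to True, is what lets the executor go on to build the AnsiballZ payload for package_facts. At the playbook level such a guard is normally written as a when: clause on the task; the snippet below only illustrates that pattern and is an assumed form, not a copy of the role's task file.

  # Pattern sketch: the distribution-version guard seen in the log,
  # written as a task-level conditional (assumed form)
  - name: Check which packages are installed
    package_facts:
    when: ansible_distribution_major_version != '6'
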
34886 1727204495.11707: ANSIBALLZ: Acquiring lock 34886 1727204495.11710: ANSIBALLZ: Lock acquired: 139734982072688 34886 1727204495.11715: ANSIBALLZ: Creating module 34886 1727204495.51099: ANSIBALLZ: Writing module into payload 34886 1727204495.51270: ANSIBALLZ: Writing module 34886 1727204495.51315: ANSIBALLZ: Renaming module 34886 1727204495.51331: ANSIBALLZ: Done creating module 34886 1727204495.51384: variable 'ansible_facts' from source: unknown 34886 1727204495.51626: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842/AnsiballZ_package_facts.py 34886 1727204495.51898: Sending initial data 34886 1727204495.51901: Sent initial data (162 bytes) 34886 1727204495.52577: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204495.52619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204495.52638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204495.52666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204495.52734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204495.54487: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204495.54530: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204495.54577: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp4i_p7egy /root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842/AnsiballZ_package_facts.py <<< 34886 1727204495.54581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842/AnsiballZ_package_facts.py" <<< 34886 1727204495.54624: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp4i_p7egy" to remote "/root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842/AnsiballZ_package_facts.py" <<< 34886 1727204495.56961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204495.57234: stderr chunk (state=3): >>><<< 34886 1727204495.57238: stdout chunk (state=3): >>><<< 34886 1727204495.57240: done transferring module to remote 34886 1727204495.57242: _low_level_execute_command(): starting 34886 1727204495.57245: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842/ /root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842/AnsiballZ_package_facts.py && sleep 0' 34886 1727204495.57836: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204495.57852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204495.57909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204495.57993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204495.58021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204495.58057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204495.58096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204495.58135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204495.60112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204495.60115: stderr chunk (state=3): >>><<< 34886 1727204495.60118: stdout chunk (state=3): >>><<< 34886 1727204495.60121: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204495.60123: _low_level_execute_command(): starting 34886 1727204495.60125: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842/AnsiballZ_package_facts.py && sleep 0' 34886 1727204495.60827: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204495.60839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204495.60851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204495.60879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204495.60895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204495.60903: stderr chunk (state=3): >>>debug2: match not found <<< 34886 1727204495.60982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204495.61034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204495.61048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204495.61124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204495.61138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204496.25961: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 34886 1727204496.26038: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": 
[{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": 
"dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 34886 1727204496.26054: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 34886 1727204496.26131: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", 
"release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": 
"8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", 
"release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", 
"source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", 
"epoch": 0, "arch": "x86_64", <<< 34886 1727204496.26180: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": 
"elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 34886 1727204496.26236: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": 
"4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 34886 1727204496.26242: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 34886 1727204496.28212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204496.28215: stderr chunk (state=3): >>><<< 34886 1727204496.28218: stdout chunk (state=3): >>><<< 34886 1727204496.28415: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 34886 1727204496.32571: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204496.32613: _low_level_execute_command(): starting 34886 1727204496.32630: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204495.0885093-35881-234768530339842/ > /dev/null 2>&1 && sleep 0' 34886 1727204496.33296: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204496.33315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204496.33336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204496.33359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204496.33407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204496.33478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204496.33501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204496.33517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204496.33599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204496.35911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204496.35926: stdout chunk (state=3): >>><<< 34886 1727204496.35939: stderr chunk (state=3): >>><<< 34886 1727204496.35961: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204496.36017: handler run complete 34886 1727204496.37573: variable 'ansible_facts' from source: unknown 34886 1727204496.38418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204496.42299: variable 'ansible_facts' from source: unknown 34886 1727204496.46627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204496.47945: attempt loop complete, returning result 34886 1727204496.48005: _execute() done 34886 1727204496.48008: dumping result to json 34886 1727204496.48161: done dumping result, returning 34886 1727204496.48172: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-04b9-2e74-000000000202] 34886 1727204496.48175: sending task result for task 12b410aa-8751-04b9-2e74-000000000202 34886 1727204496.50415: done sending task result for task 12b410aa-8751-04b9-2e74-000000000202 34886 1727204496.50418: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34886 1727204496.50515: no more pending results, returning what we have 34886 1727204496.50518: results queue empty 34886 1727204496.50522: checking for any_errors_fatal 34886 1727204496.50527: done checking for any_errors_fatal 34886 1727204496.50528: checking for max_fail_percentage 34886 1727204496.50530: done checking for max_fail_percentage 34886 1727204496.50531: checking to see if all hosts have failed and the running result is not ok 34886 1727204496.50532: done checking to see if all hosts have failed 34886 1727204496.50532: getting the remaining hosts for this loop 34886 1727204496.50534: done getting the remaining hosts for this loop 34886 1727204496.50538: getting the next task for host managed-node3 34886 1727204496.50545: done getting next task for host managed-node3 34886 1727204496.50550: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34886 1727204496.50554: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204496.50566: getting variables 34886 1727204496.50568: in VariableManager get_vars() 34886 1727204496.50608: Calling all_inventory to load vars for managed-node3 34886 1727204496.50612: Calling groups_inventory to load vars for managed-node3 34886 1727204496.50616: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204496.50628: Calling all_plugins_play to load vars for managed-node3 34886 1727204496.50632: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204496.50636: Calling groups_plugins_play to load vars for managed-node3 34886 1727204496.52169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204496.54316: done with get_vars() 34886 1727204496.54343: done getting variables 34886 1727204496.54418: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:01:36 -0400 (0:00:01.498) 0:00:14.712 ***** 34886 1727204496.54460: entering _queue_task() for managed-node3/debug 34886 1727204496.54800: worker is 1 (out of 1 available) 34886 1727204496.54815: exiting _queue_task() for managed-node3/debug 34886 1727204496.54830: done queuing things up, now waiting for results queue to drain 34886 1727204496.54832: waiting for pending results... 34886 1727204496.55212: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 34886 1727204496.55270: in run() - task 12b410aa-8751-04b9-2e74-000000000018 34886 1727204496.55287: variable 'ansible_search_path' from source: unknown 34886 1727204496.55293: variable 'ansible_search_path' from source: unknown 34886 1727204496.55337: calling self._execute() 34886 1727204496.55458: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204496.55462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204496.55465: variable 'omit' from source: magic vars 34886 1727204496.55830: variable 'ansible_distribution_major_version' from source: facts 34886 1727204496.55841: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204496.55847: variable 'omit' from source: magic vars 34886 1727204496.55900: variable 'omit' from source: magic vars 34886 1727204496.55982: variable 'network_provider' from source: set_fact 34886 1727204496.56001: variable 'omit' from source: magic vars 34886 1727204496.56040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204496.56069: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204496.56087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204496.56111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204496.56121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 
1727204496.56151: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204496.56155: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204496.56158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204496.56250: Set connection var ansible_timeout to 10 34886 1727204496.56256: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204496.56260: Set connection var ansible_connection to ssh 34886 1727204496.56267: Set connection var ansible_shell_executable to /bin/sh 34886 1727204496.56275: Set connection var ansible_pipelining to False 34886 1727204496.56278: Set connection var ansible_shell_type to sh 34886 1727204496.56301: variable 'ansible_shell_executable' from source: unknown 34886 1727204496.56305: variable 'ansible_connection' from source: unknown 34886 1727204496.56309: variable 'ansible_module_compression' from source: unknown 34886 1727204496.56312: variable 'ansible_shell_type' from source: unknown 34886 1727204496.56315: variable 'ansible_shell_executable' from source: unknown 34886 1727204496.56318: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204496.56330: variable 'ansible_pipelining' from source: unknown 34886 1727204496.56333: variable 'ansible_timeout' from source: unknown 34886 1727204496.56335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204496.56460: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204496.56471: variable 'omit' from source: magic vars 34886 1727204496.56477: starting attempt loop 34886 1727204496.56482: running the handler 34886 1727204496.56526: handler run complete 34886 1727204496.56549: attempt loop complete, returning result 34886 1727204496.56552: _execute() done 34886 1727204496.56555: dumping result to json 34886 1727204496.56557: done dumping result, returning 34886 1727204496.56560: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-04b9-2e74-000000000018] 34886 1727204496.56566: sending task result for task 12b410aa-8751-04b9-2e74-000000000018 34886 1727204496.56653: done sending task result for task 12b410aa-8751-04b9-2e74-000000000018 34886 1727204496.56656: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 34886 1727204496.56720: no more pending results, returning what we have 34886 1727204496.56724: results queue empty 34886 1727204496.56725: checking for any_errors_fatal 34886 1727204496.56737: done checking for any_errors_fatal 34886 1727204496.56738: checking for max_fail_percentage 34886 1727204496.56740: done checking for max_fail_percentage 34886 1727204496.56741: checking to see if all hosts have failed and the running result is not ok 34886 1727204496.56742: done checking to see if all hosts have failed 34886 1727204496.56742: getting the remaining hosts for this loop 34886 1727204496.56744: done getting the remaining hosts for this loop 34886 1727204496.56749: getting the next task for host managed-node3 34886 1727204496.56756: done getting next task for host managed-node3 34886 1727204496.56760: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 34886 1727204496.56763: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204496.56774: getting variables 34886 1727204496.56775: in VariableManager get_vars() 34886 1727204496.56820: Calling all_inventory to load vars for managed-node3 34886 1727204496.56823: Calling groups_inventory to load vars for managed-node3 34886 1727204496.56825: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204496.56835: Calling all_plugins_play to load vars for managed-node3 34886 1727204496.56838: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204496.56841: Calling groups_plugins_play to load vars for managed-node3 34886 1727204496.58609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204496.60187: done with get_vars() 34886 1727204496.60215: done getting variables 34886 1727204496.60263: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:01:36 -0400 (0:00:00.058) 0:00:14.770 ***** 34886 1727204496.60293: entering _queue_task() for managed-node3/fail 34886 1727204496.60533: worker is 1 (out of 1 available) 34886 1727204496.60548: exiting _queue_task() for managed-node3/fail 34886 1727204496.60563: done queuing things up, now waiting for results queue to drain 34886 1727204496.60565: waiting for pending results... 
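
The trace above shows two of the role's bookkeeping steps: a package_facts run whose module_args were manager=['auto'], strategy='first' (result censored by no_log), followed by a debug task that printed "Using network provider: nm". A minimal standalone sketch reproducing those two calls, assuming a hypothetical play name and host pattern and a msg wording inferred from the trace rather than taken from the role's actual source:

- name: Sketch reconstructed from the trace, not the role source
  hosts: managed-node3
  gather_facts: false
  tasks:
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto       # the trace shows manager=['auto'] and strategy='first'
        strategy: first
      no_log: true          # matches the "output has been hidden" result above

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider | default('nm') }}"   # 'nm' per the trace
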
34886 1727204496.60745: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34886 1727204496.60851: in run() - task 12b410aa-8751-04b9-2e74-000000000019 34886 1727204496.60864: variable 'ansible_search_path' from source: unknown 34886 1727204496.60868: variable 'ansible_search_path' from source: unknown 34886 1727204496.60903: calling self._execute() 34886 1727204496.60974: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204496.60980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204496.60991: variable 'omit' from source: magic vars 34886 1727204496.61304: variable 'ansible_distribution_major_version' from source: facts 34886 1727204496.61315: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204496.61423: variable 'network_state' from source: role '' defaults 34886 1727204496.61436: Evaluated conditional (network_state != {}): False 34886 1727204496.61440: when evaluation is False, skipping this task 34886 1727204496.61443: _execute() done 34886 1727204496.61446: dumping result to json 34886 1727204496.61450: done dumping result, returning 34886 1727204496.61462: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-04b9-2e74-000000000019] 34886 1727204496.61466: sending task result for task 12b410aa-8751-04b9-2e74-000000000019 34886 1727204496.61567: done sending task result for task 12b410aa-8751-04b9-2e74-000000000019 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34886 1727204496.61621: no more pending results, returning what we have 34886 1727204496.61625: results queue empty 34886 1727204496.61627: checking for any_errors_fatal 34886 1727204496.61633: done checking for any_errors_fatal 34886 1727204496.61634: checking for max_fail_percentage 34886 1727204496.61636: done checking for max_fail_percentage 34886 1727204496.61636: checking to see if all hosts have failed and the running result is not ok 34886 1727204496.61637: done checking to see if all hosts have failed 34886 1727204496.61638: getting the remaining hosts for this loop 34886 1727204496.61639: done getting the remaining hosts for this loop 34886 1727204496.61644: getting the next task for host managed-node3 34886 1727204496.61650: done getting next task for host managed-node3 34886 1727204496.61654: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34886 1727204496.61658: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204496.61674: getting variables 34886 1727204496.61676: in VariableManager get_vars() 34886 1727204496.61715: Calling all_inventory to load vars for managed-node3 34886 1727204496.61718: Calling groups_inventory to load vars for managed-node3 34886 1727204496.61721: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204496.61731: Calling all_plugins_play to load vars for managed-node3 34886 1727204496.61734: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204496.61738: Calling groups_plugins_play to load vars for managed-node3 34886 1727204496.62260: WORKER PROCESS EXITING 34886 1727204496.63068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204496.64643: done with get_vars() 34886 1727204496.64666: done getting variables 34886 1727204496.64723: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:01:36 -0400 (0:00:00.044) 0:00:14.815 ***** 34886 1727204496.64751: entering _queue_task() for managed-node3/fail 34886 1727204496.65013: worker is 1 (out of 1 available) 34886 1727204496.65031: exiting _queue_task() for managed-node3/fail 34886 1727204496.65044: done queuing things up, now waiting for results queue to drain 34886 1727204496.65046: waiting for pending results... 
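
The skip above comes from a fail-style guard on the initscripts provider whose only evaluated condition was network_state != {} (reported back as false_condition). A hedged tasks-file sketch of such a guard follows; the failure message and the second condition are assumptions implied by the task title, since the task was skipped before either could appear in the trace.

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider   # assumed wording
  when:
    - "network_state != {}"                  # the only condition the trace shows being evaluated
    - network_provider == "initscripts"      # assumed from the task title, never reached here
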
34886 1727204496.65226: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34886 1727204496.65325: in run() - task 12b410aa-8751-04b9-2e74-00000000001a 34886 1727204496.65338: variable 'ansible_search_path' from source: unknown 34886 1727204496.65342: variable 'ansible_search_path' from source: unknown 34886 1727204496.65378: calling self._execute() 34886 1727204496.65453: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204496.65459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204496.65469: variable 'omit' from source: magic vars 34886 1727204496.65792: variable 'ansible_distribution_major_version' from source: facts 34886 1727204496.65802: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204496.65910: variable 'network_state' from source: role '' defaults 34886 1727204496.65922: Evaluated conditional (network_state != {}): False 34886 1727204496.65926: when evaluation is False, skipping this task 34886 1727204496.65930: _execute() done 34886 1727204496.65933: dumping result to json 34886 1727204496.65935: done dumping result, returning 34886 1727204496.65946: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-04b9-2e74-00000000001a] 34886 1727204496.65949: sending task result for task 12b410aa-8751-04b9-2e74-00000000001a 34886 1727204496.66048: done sending task result for task 12b410aa-8751-04b9-2e74-00000000001a 34886 1727204496.66051: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34886 1727204496.66113: no more pending results, returning what we have 34886 1727204496.66118: results queue empty 34886 1727204496.66121: checking for any_errors_fatal 34886 1727204496.66133: done checking for any_errors_fatal 34886 1727204496.66134: checking for max_fail_percentage 34886 1727204496.66136: done checking for max_fail_percentage 34886 1727204496.66137: checking to see if all hosts have failed and the running result is not ok 34886 1727204496.66138: done checking to see if all hosts have failed 34886 1727204496.66139: getting the remaining hosts for this loop 34886 1727204496.66140: done getting the remaining hosts for this loop 34886 1727204496.66145: getting the next task for host managed-node3 34886 1727204496.66151: done getting next task for host managed-node3 34886 1727204496.66156: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34886 1727204496.66159: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204496.66177: getting variables 34886 1727204496.66179: in VariableManager get_vars() 34886 1727204496.66218: Calling all_inventory to load vars for managed-node3 34886 1727204496.66224: Calling groups_inventory to load vars for managed-node3 34886 1727204496.66227: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204496.66236: Calling all_plugins_play to load vars for managed-node3 34886 1727204496.66239: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204496.66243: Calling groups_plugins_play to load vars for managed-node3 34886 1727204496.67439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204496.69126: done with get_vars() 34886 1727204496.69147: done getting variables 34886 1727204496.69201: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:01:36 -0400 (0:00:00.044) 0:00:14.860 ***** 34886 1727204496.69229: entering _queue_task() for managed-node3/fail 34886 1727204496.70252: worker is 1 (out of 1 available) 34886 1727204496.70268: exiting _queue_task() for managed-node3/fail 34886 1727204496.70282: done queuing things up, now waiting for results queue to drain 34886 1727204496.70284: waiting for pending results... 
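
The "system version ... is below 8" guard above is skipped on that same first condition (network_state != {}); the version comparison named in its title never shows up in the trace, which is consistent with a when list being evaluated in order and stopping at the first false condition. A generic illustration of that short-circuit, with an assumed second condition:

- name: Guard whose second condition is only reached when the first one holds
  ansible.builtin.fail:
    msg: Unsupported combination              # placeholder message
  when:
    - "network_state != {}"                             # evaluated first; False here, so the task is skipped
    - ansible_distribution_major_version | int < 8      # assumed condition, never evaluated in the trace
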
34886 1727204496.70459: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34886 1727204496.70558: in run() - task 12b410aa-8751-04b9-2e74-00000000001b 34886 1727204496.70569: variable 'ansible_search_path' from source: unknown 34886 1727204496.70573: variable 'ansible_search_path' from source: unknown 34886 1727204496.70607: calling self._execute() 34886 1727204496.70678: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204496.70684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204496.70695: variable 'omit' from source: magic vars 34886 1727204496.71004: variable 'ansible_distribution_major_version' from source: facts 34886 1727204496.71014: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204496.71164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204496.72885: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204496.72947: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204496.72979: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204496.73010: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204496.73040: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204496.73103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204496.73129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204496.73154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204496.73190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204496.73204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204496.73283: variable 'ansible_distribution_major_version' from source: facts 34886 1727204496.73297: Evaluated conditional (ansible_distribution_major_version | int > 9): True 34886 1727204496.73393: variable 'ansible_distribution' from source: facts 34886 1727204496.73397: variable '__network_rh_distros' from source: role '' defaults 34886 1727204496.73407: Evaluated conditional (ansible_distribution in __network_rh_distros): False 34886 1727204496.73410: when evaluation is False, skipping this task 34886 1727204496.73413: _execute() done 34886 1727204496.73418: dumping result to json 34886 1727204496.73424: done dumping result, returning 34886 1727204496.73430: done running TaskExecutor() 
for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-04b9-2e74-00000000001b] 34886 1727204496.73436: sending task result for task 12b410aa-8751-04b9-2e74-00000000001b 34886 1727204496.73529: done sending task result for task 12b410aa-8751-04b9-2e74-00000000001b 34886 1727204496.73532: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 34886 1727204496.73614: no more pending results, returning what we have 34886 1727204496.73618: results queue empty 34886 1727204496.73621: checking for any_errors_fatal 34886 1727204496.73626: done checking for any_errors_fatal 34886 1727204496.73627: checking for max_fail_percentage 34886 1727204496.73628: done checking for max_fail_percentage 34886 1727204496.73629: checking to see if all hosts have failed and the running result is not ok 34886 1727204496.73630: done checking to see if all hosts have failed 34886 1727204496.73631: getting the remaining hosts for this loop 34886 1727204496.73633: done getting the remaining hosts for this loop 34886 1727204496.73636: getting the next task for host managed-node3 34886 1727204496.73644: done getting next task for host managed-node3 34886 1727204496.73648: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34886 1727204496.73651: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204496.73666: getting variables 34886 1727204496.73667: in VariableManager get_vars() 34886 1727204496.73709: Calling all_inventory to load vars for managed-node3 34886 1727204496.73712: Calling groups_inventory to load vars for managed-node3 34886 1727204496.73715: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204496.73727: Calling all_plugins_play to load vars for managed-node3 34886 1727204496.73730: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204496.73734: Calling groups_plugins_play to load vars for managed-node3 34886 1727204496.74935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204496.77829: done with get_vars() 34886 1727204496.77866: done getting variables 34886 1727204496.77992: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:01:36 -0400 (0:00:00.087) 0:00:14.948 ***** 34886 1727204496.78034: entering _queue_task() for managed-node3/dnf 34886 1727204496.78368: worker is 1 (out of 1 available) 34886 1727204496.78384: exiting _queue_task() for managed-node3/dnf 34886 1727204496.78600: done queuing things up, now waiting for results queue to drain 34886 1727204496.78603: waiting for pending results... 
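
For the teaming guard above, the trace shows both halves of the gate: ansible_distribution_major_version | int > 9 evaluated True and ansible_distribution in __network_rh_distros evaluated False, so the task is skipped. A tasks-file sketch with that gating; the message wording is an assumption, and __network_rh_distros is a role default the trace only references by name.

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on this platform      # assumed wording
  when:
    - ansible_distribution_major_version | int > 9      # True in the trace
    - ansible_distribution in __network_rh_distros      # False in the trace, hence the skip
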
34886 1727204496.78813: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34886 1727204496.78909: in run() - task 12b410aa-8751-04b9-2e74-00000000001c 34886 1727204496.79026: variable 'ansible_search_path' from source: unknown 34886 1727204496.79030: variable 'ansible_search_path' from source: unknown 34886 1727204496.79034: calling self._execute() 34886 1727204496.79134: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204496.79148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204496.79165: variable 'omit' from source: magic vars 34886 1727204496.79897: variable 'ansible_distribution_major_version' from source: facts 34886 1727204496.79902: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204496.80250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204496.82179: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204496.82516: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204496.82549: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204496.82579: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204496.82605: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204496.82673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204496.82794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204496.82798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204496.82801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204496.82818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204496.82955: variable 'ansible_distribution' from source: facts 34886 1727204496.82966: variable 'ansible_distribution_major_version' from source: facts 34886 1727204496.82979: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 34886 1727204496.83123: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204496.83283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204496.83319: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204496.83356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204496.83425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204496.83449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204496.83507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204496.83532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204496.83564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204496.83622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204496.83633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204496.83687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204496.83723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204496.83757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204496.83808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204496.83829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204496.84025: variable 'network_connections' from source: task vars 34886 1727204496.84044: variable 'interface' from source: play vars 34886 1727204496.84130: variable 'interface' from source: play vars 34886 1727204496.84227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204496.84432: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204496.84482: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204496.84526: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204496.84565: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204496.84623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204496.84655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204496.84702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204496.84795: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204496.84811: variable '__network_team_connections_defined' from source: role '' defaults 34886 1727204496.85133: variable 'network_connections' from source: task vars 34886 1727204496.85146: variable 'interface' from source: play vars 34886 1727204496.85229: variable 'interface' from source: play vars 34886 1727204496.85276: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34886 1727204496.85285: when evaluation is False, skipping this task 34886 1727204496.85296: _execute() done 34886 1727204496.85304: dumping result to json 34886 1727204496.85313: done dumping result, returning 34886 1727204496.85327: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-04b9-2e74-00000000001c] 34886 1727204496.85394: sending task result for task 12b410aa-8751-04b9-2e74-00000000001c 34886 1727204496.85696: done sending task result for task 12b410aa-8751-04b9-2e74-00000000001c 34886 1727204496.85699: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34886 1727204496.85753: no more pending results, returning what we have 34886 1727204496.85757: results queue empty 34886 1727204496.85758: checking for any_errors_fatal 34886 1727204496.85764: done checking for any_errors_fatal 34886 1727204496.85765: checking for max_fail_percentage 34886 1727204496.85767: done checking for max_fail_percentage 34886 1727204496.85768: checking to see if all hosts have failed and the running result is not ok 34886 1727204496.85770: done checking to see if all hosts have failed 34886 1727204496.85771: getting the remaining hosts for this loop 34886 1727204496.85772: done getting the remaining hosts for this loop 34886 1727204496.85777: getting the next task for host managed-node3 34886 1727204496.85783: done getting next task for host managed-node3 34886 1727204496.85787: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the YUM package manager due to wireless or team interfaces 34886 1727204496.85793: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204496.85809: getting variables 34886 1727204496.85811: in VariableManager get_vars() 34886 1727204496.85859: Calling all_inventory to load vars for managed-node3 34886 1727204496.85863: Calling groups_inventory to load vars for managed-node3 34886 1727204496.85866: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204496.85877: Calling all_plugins_play to load vars for managed-node3 34886 1727204496.85882: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204496.85887: Calling groups_plugins_play to load vars for managed-node3 34886 1727204496.88294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204496.91149: done with get_vars() 34886 1727204496.91195: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34886 1727204496.91285: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:01:36 -0400 (0:00:00.132) 0:00:15.081 ***** 34886 1727204496.91326: entering _queue_task() for managed-node3/yum 34886 1727204496.91329: Creating lock for yum 34886 1727204496.91704: worker is 1 (out of 1 available) 34886 1727204496.91719: exiting _queue_task() for managed-node3/yum 34886 1727204496.91732: done queuing things up, now waiting for results queue to drain 34886 1727204496.91734: waiting for pending results... 
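
The DNF check above loads the dnf action plugin and is skipped because neither wireless nor team connections are defined in network_connections (false_condition: __network_wireless_connections_defined or __network_team_connections_defined). A sketch of a task with the gating the trace reports; the package name, state, and check_mode are purely illustrative assumptions, since no module arguments for this task appear in the log.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager        # illustrative package, not from the trace
    state: latest
  check_mode: true              # assumed: "check if updates are available" suggests a dry run
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # True in the trace
    - __network_wireless_connections_defined or __network_team_connections_defined       # False, hence the skip
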
34886 1727204496.92028: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34886 1727204496.92195: in run() - task 12b410aa-8751-04b9-2e74-00000000001d 34886 1727204496.92223: variable 'ansible_search_path' from source: unknown 34886 1727204496.92231: variable 'ansible_search_path' from source: unknown 34886 1727204496.92273: calling self._execute() 34886 1727204496.92376: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204496.92393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204496.92410: variable 'omit' from source: magic vars 34886 1727204496.92838: variable 'ansible_distribution_major_version' from source: facts 34886 1727204496.92858: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204496.93089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204496.95734: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204496.95836: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204496.95885: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204496.95939: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204496.95978: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204496.96077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204496.96124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204496.96163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204496.96228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204496.96257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204496.96385: variable 'ansible_distribution_major_version' from source: facts 34886 1727204496.96412: Evaluated conditional (ansible_distribution_major_version | int < 8): False 34886 1727204496.96454: when evaluation is False, skipping this task 34886 1727204496.96458: _execute() done 34886 1727204496.96461: dumping result to json 34886 1727204496.96463: done dumping result, returning 34886 1727204496.96466: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-04b9-2e74-00000000001d] 34886 
1727204496.96471: sending task result for task 12b410aa-8751-04b9-2e74-00000000001d skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 34886 1727204496.96750: no more pending results, returning what we have 34886 1727204496.96755: results queue empty 34886 1727204496.96756: checking for any_errors_fatal 34886 1727204496.96764: done checking for any_errors_fatal 34886 1727204496.96765: checking for max_fail_percentage 34886 1727204496.96767: done checking for max_fail_percentage 34886 1727204496.96768: checking to see if all hosts have failed and the running result is not ok 34886 1727204496.96769: done checking to see if all hosts have failed 34886 1727204496.96770: getting the remaining hosts for this loop 34886 1727204496.96771: done getting the remaining hosts for this loop 34886 1727204496.96776: getting the next task for host managed-node3 34886 1727204496.96783: done getting next task for host managed-node3 34886 1727204496.96788: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34886 1727204496.96795: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204496.96906: getting variables 34886 1727204496.96908: in VariableManager get_vars() 34886 1727204496.96956: Calling all_inventory to load vars for managed-node3 34886 1727204496.96959: Calling groups_inventory to load vars for managed-node3 34886 1727204496.96962: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204496.96974: Calling all_plugins_play to load vars for managed-node3 34886 1727204496.96977: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204496.96981: Calling groups_plugins_play to load vars for managed-node3 34886 1727204496.97507: done sending task result for task 12b410aa-8751-04b9-2e74-00000000001d 34886 1727204496.97511: WORKER PROCESS EXITING 34886 1727204496.99241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204497.05833: done with get_vars() 34886 1727204497.05868: done getting variables 34886 1727204497.05929: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:01:37 -0400 (0:00:00.146) 0:00:15.227 ***** 34886 1727204497.05962: entering _queue_task() for managed-node3/fail 34886 1727204497.06313: worker is 1 (out of 1 available) 34886 1727204497.06329: exiting _queue_task() for managed-node3/fail 34886 1727204497.06342: done queuing things up, now waiting for results queue to drain 34886 1727204497.06345: waiting for pending results... 
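
The YUM variant above is the legacy counterpart of the DNF check; the trace notes "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf", and the task is skipped immediately on ansible_distribution_major_version | int < 8, which is False on this Fedora host (the fc39 packages in the dump above). A minimal sketch with that gate; the module arguments are again illustrative assumptions.

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: NetworkManager        # illustrative package, not from the trace
    state: latest
  check_mode: true              # assumed dry-run intent
  when:
    - ansible_distribution_major_version | int < 8      # False in the trace, so the task is skipped
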
34886 1727204497.06582: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34886 1727204497.06790: in run() - task 12b410aa-8751-04b9-2e74-00000000001e 34886 1727204497.06810: variable 'ansible_search_path' from source: unknown 34886 1727204497.06815: variable 'ansible_search_path' from source: unknown 34886 1727204497.06863: calling self._execute() 34886 1727204497.06972: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204497.06982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204497.06997: variable 'omit' from source: magic vars 34886 1727204497.07681: variable 'ansible_distribution_major_version' from source: facts 34886 1727204497.07685: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204497.07825: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204497.08153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204497.10190: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204497.10256: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204497.10286: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204497.10323: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204497.10345: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204497.10417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.10444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.10467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.10503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.10517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.10561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.10584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.10607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.10639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.10653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.10689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.10712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.10734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.10766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.10779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.10944: variable 'network_connections' from source: task vars 34886 1727204497.11094: variable 'interface' from source: play vars 34886 1727204497.11098: variable 'interface' from source: play vars 34886 1727204497.11141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204497.11354: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204497.11410: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204497.11453: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204497.11502: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204497.11558: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204497.11593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204497.11630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.11669: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204497.11742: variable '__network_team_connections_defined' from source: role '' defaults 34886 1727204497.12075: variable 'network_connections' from 
source: task vars 34886 1727204497.12091: variable 'interface' from source: play vars 34886 1727204497.12163: variable 'interface' from source: play vars 34886 1727204497.12193: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34886 1727204497.12198: when evaluation is False, skipping this task 34886 1727204497.12201: _execute() done 34886 1727204497.12204: dumping result to json 34886 1727204497.12209: done dumping result, returning 34886 1727204497.12217: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-04b9-2e74-00000000001e] 34886 1727204497.12246: sending task result for task 12b410aa-8751-04b9-2e74-00000000001e 34886 1727204497.12346: done sending task result for task 12b410aa-8751-04b9-2e74-00000000001e 34886 1727204497.12352: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34886 1727204497.12433: no more pending results, returning what we have 34886 1727204497.12437: results queue empty 34886 1727204497.12438: checking for any_errors_fatal 34886 1727204497.12446: done checking for any_errors_fatal 34886 1727204497.12447: checking for max_fail_percentage 34886 1727204497.12449: done checking for max_fail_percentage 34886 1727204497.12450: checking to see if all hosts have failed and the running result is not ok 34886 1727204497.12451: done checking to see if all hosts have failed 34886 1727204497.12452: getting the remaining hosts for this loop 34886 1727204497.12453: done getting the remaining hosts for this loop 34886 1727204497.12457: getting the next task for host managed-node3 34886 1727204497.12465: done getting next task for host managed-node3 34886 1727204497.12469: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34886 1727204497.12473: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204497.12488: getting variables 34886 1727204497.12491: in VariableManager get_vars() 34886 1727204497.12533: Calling all_inventory to load vars for managed-node3 34886 1727204497.12536: Calling groups_inventory to load vars for managed-node3 34886 1727204497.12539: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204497.12548: Calling all_plugins_play to load vars for managed-node3 34886 1727204497.12551: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204497.12554: Calling groups_plugins_play to load vars for managed-node3 34886 1727204497.13778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204497.16110: done with get_vars() 34886 1727204497.16133: done getting variables 34886 1727204497.16183: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:01:37 -0400 (0:00:00.102) 0:00:15.329 ***** 34886 1727204497.16213: entering _queue_task() for managed-node3/package 34886 1727204497.16442: worker is 1 (out of 1 available) 34886 1727204497.16456: exiting _queue_task() for managed-node3/package 34886 1727204497.16469: done queuing things up, now waiting for results queue to drain 34886 1727204497.16471: waiting for pending results... 34886 1727204497.16662: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 34886 1727204497.16761: in run() - task 12b410aa-8751-04b9-2e74-00000000001f 34886 1727204497.16774: variable 'ansible_search_path' from source: unknown 34886 1727204497.16778: variable 'ansible_search_path' from source: unknown 34886 1727204497.16815: calling self._execute() 34886 1727204497.16895: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204497.16902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204497.16917: variable 'omit' from source: magic vars 34886 1727204497.17241: variable 'ansible_distribution_major_version' from source: facts 34886 1727204497.17252: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204497.17695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204497.17846: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204497.17903: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204497.17995: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204497.18050: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204497.18196: variable 'network_packages' from source: role '' defaults 34886 1727204497.18346: variable '__network_provider_setup' from source: role '' defaults 34886 1727204497.18375: variable '__network_service_name_default_nm' from source: role '' defaults 34886 1727204497.18474: variable 
'__network_service_name_default_nm' from source: role '' defaults 34886 1727204497.18490: variable '__network_packages_default_nm' from source: role '' defaults 34886 1727204497.18576: variable '__network_packages_default_nm' from source: role '' defaults 34886 1727204497.18779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204497.21105: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204497.21157: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204497.21191: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204497.21219: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204497.21243: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204497.21313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.21340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.21360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.21397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.21413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.21454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.21474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.21499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.21535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.21547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.21734: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34886 1727204497.21843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.21895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.22098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.22101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.22104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.22141: variable 'ansible_python' from source: facts 34886 1727204497.22176: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34886 1727204497.22296: variable '__network_wpa_supplicant_required' from source: role '' defaults 34886 1727204497.22414: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34886 1727204497.22623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.22705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.22859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.22864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.22891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.22947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.23020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.23023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.23053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.23066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.23194: variable 'network_connections' from source: task vars 34886 1727204497.23215: variable 'interface' from source: play vars 34886 1727204497.23291: variable 'interface' from source: play vars 34886 1727204497.23357: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204497.23379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204497.23406: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.23439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204497.23477: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204497.23748: variable 'network_connections' from source: task vars 34886 1727204497.23754: variable 'interface' from source: play vars 34886 1727204497.23817: variable 'interface' from source: play vars 34886 1727204497.23862: variable '__network_packages_default_wireless' from source: role '' defaults 34886 1727204497.23932: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204497.24183: variable 'network_connections' from source: task vars 34886 1727204497.24193: variable 'interface' from source: play vars 34886 1727204497.24251: variable 'interface' from source: play vars 34886 1727204497.24274: variable '__network_packages_default_team' from source: role '' defaults 34886 1727204497.24348: variable '__network_team_connections_defined' from source: role '' defaults 34886 1727204497.24604: variable 'network_connections' from source: task vars 34886 1727204497.24608: variable 'interface' from source: play vars 34886 1727204497.24668: variable 'interface' from source: play vars 34886 1727204497.24723: variable '__network_service_name_default_initscripts' from source: role '' defaults 34886 1727204497.24778: variable '__network_service_name_default_initscripts' from source: role '' defaults 34886 1727204497.24785: variable '__network_packages_default_initscripts' from source: role '' defaults 34886 1727204497.24841: variable '__network_packages_default_initscripts' from source: role '' defaults 34886 1727204497.25030: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34886 1727204497.25432: variable 'network_connections' from source: task vars 34886 1727204497.25436: variable 'interface' from source: play vars 34886 1727204497.25488: variable 'interface' from source: play vars 34886 1727204497.25501: variable 'ansible_distribution' from source: facts 34886 1727204497.25504: variable '__network_rh_distros' from source: role '' defaults 34886 1727204497.25512: variable 'ansible_distribution_major_version' from source: facts 34886 1727204497.25533: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34886 1727204497.25674: variable 'ansible_distribution' from source: facts 34886 
1727204497.25678: variable '__network_rh_distros' from source: role '' defaults 34886 1727204497.25684: variable 'ansible_distribution_major_version' from source: facts 34886 1727204497.25692: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34886 1727204497.25839: variable 'ansible_distribution' from source: facts 34886 1727204497.25842: variable '__network_rh_distros' from source: role '' defaults 34886 1727204497.25845: variable 'ansible_distribution_major_version' from source: facts 34886 1727204497.25873: variable 'network_provider' from source: set_fact 34886 1727204497.25886: variable 'ansible_facts' from source: unknown 34886 1727204497.26471: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 34886 1727204497.26475: when evaluation is False, skipping this task 34886 1727204497.26478: _execute() done 34886 1727204497.26482: dumping result to json 34886 1727204497.26484: done dumping result, returning 34886 1727204497.26497: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-04b9-2e74-00000000001f] 34886 1727204497.26500: sending task result for task 12b410aa-8751-04b9-2e74-00000000001f 34886 1727204497.26603: done sending task result for task 12b410aa-8751-04b9-2e74-00000000001f 34886 1727204497.26606: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 34886 1727204497.26658: no more pending results, returning what we have 34886 1727204497.26662: results queue empty 34886 1727204497.26663: checking for any_errors_fatal 34886 1727204497.26671: done checking for any_errors_fatal 34886 1727204497.26672: checking for max_fail_percentage 34886 1727204497.26673: done checking for max_fail_percentage 34886 1727204497.26674: checking to see if all hosts have failed and the running result is not ok 34886 1727204497.26675: done checking to see if all hosts have failed 34886 1727204497.26676: getting the remaining hosts for this loop 34886 1727204497.26678: done getting the remaining hosts for this loop 34886 1727204497.26682: getting the next task for host managed-node3 34886 1727204497.26691: done getting next task for host managed-node3 34886 1727204497.26695: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34886 1727204497.26699: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204497.26716: getting variables 34886 1727204497.26718: in VariableManager get_vars() 34886 1727204497.26764: Calling all_inventory to load vars for managed-node3 34886 1727204497.26767: Calling groups_inventory to load vars for managed-node3 34886 1727204497.26770: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204497.26780: Calling all_plugins_play to load vars for managed-node3 34886 1727204497.26783: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204497.26787: Calling groups_plugins_play to load vars for managed-node3 34886 1727204497.28166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204497.29738: done with get_vars() 34886 1727204497.29760: done getting variables 34886 1727204497.29811: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:01:37 -0400 (0:00:00.136) 0:00:15.466 ***** 34886 1727204497.29840: entering _queue_task() for managed-node3/package 34886 1727204497.30090: worker is 1 (out of 1 available) 34886 1727204497.30104: exiting _queue_task() for managed-node3/package 34886 1727204497.30118: done queuing things up, now waiting for results queue to drain 34886 1727204497.30120: waiting for pending results... 
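[Editor's note] The preceding entries close out the "Install packages" task at roles/network/tasks/main.yml:73: every name in network_packages is already present in the gathered package facts, so the condition not network_packages is subset(ansible_facts.packages.keys()) evaluates to False and the package action is skipped without touching the package manager. A hedged sketch of that guard pattern, with the when expression taken verbatim from the log and the rest of the layout assumed:

# Sketch of the skip-if-already-installed guard seen in the log; the exact
# task layout in the role source may differ.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())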
34886 1727204497.30315: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34886 1727204497.30414: in run() - task 12b410aa-8751-04b9-2e74-000000000020 34886 1727204497.30434: variable 'ansible_search_path' from source: unknown 34886 1727204497.30438: variable 'ansible_search_path' from source: unknown 34886 1727204497.30470: calling self._execute() 34886 1727204497.30552: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204497.30562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204497.30571: variable 'omit' from source: magic vars 34886 1727204497.30890: variable 'ansible_distribution_major_version' from source: facts 34886 1727204497.30903: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204497.31006: variable 'network_state' from source: role '' defaults 34886 1727204497.31017: Evaluated conditional (network_state != {}): False 34886 1727204497.31020: when evaluation is False, skipping this task 34886 1727204497.31027: _execute() done 34886 1727204497.31030: dumping result to json 34886 1727204497.31035: done dumping result, returning 34886 1727204497.31044: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-04b9-2e74-000000000020] 34886 1727204497.31050: sending task result for task 12b410aa-8751-04b9-2e74-000000000020 34886 1727204497.31145: done sending task result for task 12b410aa-8751-04b9-2e74-000000000020 34886 1727204497.31148: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34886 1727204497.31207: no more pending results, returning what we have 34886 1727204497.31212: results queue empty 34886 1727204497.31213: checking for any_errors_fatal 34886 1727204497.31220: done checking for any_errors_fatal 34886 1727204497.31221: checking for max_fail_percentage 34886 1727204497.31223: done checking for max_fail_percentage 34886 1727204497.31224: checking to see if all hosts have failed and the running result is not ok 34886 1727204497.31225: done checking to see if all hosts have failed 34886 1727204497.31226: getting the remaining hosts for this loop 34886 1727204497.31227: done getting the remaining hosts for this loop 34886 1727204497.31232: getting the next task for host managed-node3 34886 1727204497.31238: done getting next task for host managed-node3 34886 1727204497.31243: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34886 1727204497.31246: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204497.31268: getting variables 34886 1727204497.31270: in VariableManager get_vars() 34886 1727204497.31312: Calling all_inventory to load vars for managed-node3 34886 1727204497.31315: Calling groups_inventory to load vars for managed-node3 34886 1727204497.31318: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204497.31327: Calling all_plugins_play to load vars for managed-node3 34886 1727204497.31330: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204497.31334: Calling groups_plugins_play to load vars for managed-node3 34886 1727204497.32505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204497.34165: done with get_vars() 34886 1727204497.34186: done getting variables 34886 1727204497.34239: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:01:37 -0400 (0:00:00.044) 0:00:15.510 ***** 34886 1727204497.34265: entering _queue_task() for managed-node3/package 34886 1727204497.34502: worker is 1 (out of 1 available) 34886 1727204497.34516: exiting _queue_task() for managed-node3/package 34886 1727204497.34529: done queuing things up, now waiting for results queue to drain 34886 1727204497.34531: waiting for pending results... 
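[Editor's note] The nmstate-related install task above was skipped because network_state is empty, and the following entries show the python3-libnmstate task being skipped on the same test. A hedged sketch of that shared guard; the package names are inferred from the task titles in the log, not read from the role source:

# Illustrative only: package names are inferred from the task titles in the log.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}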
34886 1727204497.34716: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34886 1727204497.34823: in run() - task 12b410aa-8751-04b9-2e74-000000000021 34886 1727204497.34838: variable 'ansible_search_path' from source: unknown 34886 1727204497.34842: variable 'ansible_search_path' from source: unknown 34886 1727204497.34879: calling self._execute() 34886 1727204497.34956: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204497.34962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204497.34974: variable 'omit' from source: magic vars 34886 1727204497.35285: variable 'ansible_distribution_major_version' from source: facts 34886 1727204497.35297: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204497.35401: variable 'network_state' from source: role '' defaults 34886 1727204497.35413: Evaluated conditional (network_state != {}): False 34886 1727204497.35416: when evaluation is False, skipping this task 34886 1727204497.35420: _execute() done 34886 1727204497.35423: dumping result to json 34886 1727204497.35434: done dumping result, returning 34886 1727204497.35438: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-04b9-2e74-000000000021] 34886 1727204497.35445: sending task result for task 12b410aa-8751-04b9-2e74-000000000021 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34886 1727204497.35593: no more pending results, returning what we have 34886 1727204497.35597: results queue empty 34886 1727204497.35598: checking for any_errors_fatal 34886 1727204497.35607: done checking for any_errors_fatal 34886 1727204497.35607: checking for max_fail_percentage 34886 1727204497.35609: done checking for max_fail_percentage 34886 1727204497.35610: checking to see if all hosts have failed and the running result is not ok 34886 1727204497.35611: done checking to see if all hosts have failed 34886 1727204497.35612: getting the remaining hosts for this loop 34886 1727204497.35614: done getting the remaining hosts for this loop 34886 1727204497.35618: getting the next task for host managed-node3 34886 1727204497.35624: done getting next task for host managed-node3 34886 1727204497.35629: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34886 1727204497.35632: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204497.35648: getting variables 34886 1727204497.35650: in VariableManager get_vars() 34886 1727204497.35688: Calling all_inventory to load vars for managed-node3 34886 1727204497.35693: Calling groups_inventory to load vars for managed-node3 34886 1727204497.35696: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204497.35705: Calling all_plugins_play to load vars for managed-node3 34886 1727204497.35709: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204497.35712: Calling groups_plugins_play to load vars for managed-node3 34886 1727204497.36745: done sending task result for task 12b410aa-8751-04b9-2e74-000000000021 34886 1727204497.36749: WORKER PROCESS EXITING 34886 1727204497.36902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204497.38481: done with get_vars() 34886 1727204497.38504: done getting variables 34886 1727204497.38587: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:01:37 -0400 (0:00:00.043) 0:00:15.553 ***** 34886 1727204497.38615: entering _queue_task() for managed-node3/service 34886 1727204497.38617: Creating lock for service 34886 1727204497.38854: worker is 1 (out of 1 available) 34886 1727204497.38868: exiting _queue_task() for managed-node3/service 34886 1727204497.38882: done queuing things up, now waiting for results queue to drain 34886 1727204497.38884: waiting for pending results... 
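[Editor's note] The task queued above ("Restart NetworkManager due to wireless or team interfaces", roles/network/tasks/main.yml:109) is the first use of the 'service' action in this run, which is why the log shows "Creating lock for service" and found_in_cache=False. As the following entries show, its conditional evaluates to False and it is skipped. A hedged sketch of such a restart task; the service name comes from the task title, not from the role source:

# Hedged sketch of the restart task queued above (tasks/main.yml:109);
# the service name is assumed from the task title.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined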
34886 1727204497.39059: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34886 1727204497.39165: in run() - task 12b410aa-8751-04b9-2e74-000000000022 34886 1727204497.39179: variable 'ansible_search_path' from source: unknown 34886 1727204497.39183: variable 'ansible_search_path' from source: unknown 34886 1727204497.39216: calling self._execute() 34886 1727204497.39295: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204497.39303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204497.39313: variable 'omit' from source: magic vars 34886 1727204497.39626: variable 'ansible_distribution_major_version' from source: facts 34886 1727204497.39635: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204497.39737: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204497.39909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204497.41629: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204497.41961: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204497.41994: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204497.42026: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204497.42048: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204497.42118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.42144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.42165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.42204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.42218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.42259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.42279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.42306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 34886 1727204497.42338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.42352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.42387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.42412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.42435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.42465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.42478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.42622: variable 'network_connections' from source: task vars 34886 1727204497.42633: variable 'interface' from source: play vars 34886 1727204497.42693: variable 'interface' from source: play vars 34886 1727204497.42759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204497.42893: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204497.42944: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204497.42951: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204497.42979: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204497.43029: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204497.43051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204497.43073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.43098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204497.43148: variable '__network_team_connections_defined' from source: role '' defaults 34886 1727204497.43352: variable 'network_connections' from source: task vars 34886 1727204497.43356: variable 'interface' from source: 
play vars 34886 1727204497.43413: variable 'interface' from source: play vars 34886 1727204497.43443: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34886 1727204497.43447: when evaluation is False, skipping this task 34886 1727204497.43450: _execute() done 34886 1727204497.43452: dumping result to json 34886 1727204497.43456: done dumping result, returning 34886 1727204497.43463: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-04b9-2e74-000000000022] 34886 1727204497.43470: sending task result for task 12b410aa-8751-04b9-2e74-000000000022 34886 1727204497.43567: done sending task result for task 12b410aa-8751-04b9-2e74-000000000022 34886 1727204497.43577: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34886 1727204497.43640: no more pending results, returning what we have 34886 1727204497.43644: results queue empty 34886 1727204497.43646: checking for any_errors_fatal 34886 1727204497.43653: done checking for any_errors_fatal 34886 1727204497.43654: checking for max_fail_percentage 34886 1727204497.43656: done checking for max_fail_percentage 34886 1727204497.43657: checking to see if all hosts have failed and the running result is not ok 34886 1727204497.43658: done checking to see if all hosts have failed 34886 1727204497.43658: getting the remaining hosts for this loop 34886 1727204497.43660: done getting the remaining hosts for this loop 34886 1727204497.43664: getting the next task for host managed-node3 34886 1727204497.43670: done getting next task for host managed-node3 34886 1727204497.43674: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34886 1727204497.43677: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204497.43695: getting variables 34886 1727204497.43698: in VariableManager get_vars() 34886 1727204497.43743: Calling all_inventory to load vars for managed-node3 34886 1727204497.43746: Calling groups_inventory to load vars for managed-node3 34886 1727204497.43748: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204497.43759: Calling all_plugins_play to load vars for managed-node3 34886 1727204497.43762: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204497.43765: Calling groups_plugins_play to load vars for managed-node3 34886 1727204497.45146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204497.46716: done with get_vars() 34886 1727204497.46747: done getting variables 34886 1727204497.46802: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:01:37 -0400 (0:00:00.082) 0:00:15.636 ***** 34886 1727204497.46830: entering _queue_task() for managed-node3/service 34886 1727204497.47095: worker is 1 (out of 1 available) 34886 1727204497.47111: exiting _queue_task() for managed-node3/service 34886 1727204497.47128: done queuing things up, now waiting for results queue to drain 34886 1727204497.47130: waiting for pending results... 34886 1727204497.47331: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34886 1727204497.47437: in run() - task 12b410aa-8751-04b9-2e74-000000000023 34886 1727204497.47453: variable 'ansible_search_path' from source: unknown 34886 1727204497.47458: variable 'ansible_search_path' from source: unknown 34886 1727204497.47493: calling self._execute() 34886 1727204497.47577: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204497.47585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204497.47598: variable 'omit' from source: magic vars 34886 1727204497.47916: variable 'ansible_distribution_major_version' from source: facts 34886 1727204497.47930: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204497.48073: variable 'network_provider' from source: set_fact 34886 1727204497.48077: variable 'network_state' from source: role '' defaults 34886 1727204497.48088: Evaluated conditional (network_provider == "nm" or network_state != {}): True 34886 1727204497.48097: variable 'omit' from source: magic vars 34886 1727204497.48147: variable 'omit' from source: magic vars 34886 1727204497.48174: variable 'network_service_name' from source: role '' defaults 34886 1727204497.48243: variable 'network_service_name' from source: role '' defaults 34886 1727204497.48341: variable '__network_provider_setup' from source: role '' defaults 34886 1727204497.48347: variable '__network_service_name_default_nm' from source: role '' defaults 34886 1727204497.48403: variable '__network_service_name_default_nm' from source: role '' defaults 34886 1727204497.48416: variable '__network_packages_default_nm' from source: role '' defaults 
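[Editor's note] Unlike the tasks before it, "Enable and start NetworkManager" (roles/network/tasks/main.yml:122) passes its conditional: the log above shows network_provider == "nm" or network_state != {} evaluating to True, and network_service_name being resolved from the role defaults, so the service action will actually run against the host. A hedged sketch of a task shaped this way; the started/enabled values are assumptions based on the task title:

# Sketch of the enable-and-start task whose conditional just evaluated True
# (tasks/main.yml:122); state/enabled values are assumed from the title.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}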
34886 1727204497.48469: variable '__network_packages_default_nm' from source: role '' defaults 34886 1727204497.48671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204497.50415: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204497.50481: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204497.50517: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204497.50551: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204497.50573: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204497.50648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.50672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.50694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.50734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.50747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.50787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.50810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.50837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.50868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.50881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.51079: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34886 1727204497.51184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.51205: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.51228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.51264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.51275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.51352: variable 'ansible_python' from source: facts 34886 1727204497.51374: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34886 1727204497.51445: variable '__network_wpa_supplicant_required' from source: role '' defaults 34886 1727204497.51516: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34886 1727204497.51629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.51649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.51669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.51707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.51720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.51762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204497.51784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204497.51809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.51845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204497.51857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204497.51978: variable 'network_connections' from 
source: task vars 34886 1727204497.51986: variable 'interface' from source: play vars 34886 1727204497.52054: variable 'interface' from source: play vars 34886 1727204497.52148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204497.52304: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204497.52350: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204497.52391: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204497.52429: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204497.52483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204497.52510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204497.52539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204497.52568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204497.52612: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204497.52848: variable 'network_connections' from source: task vars 34886 1727204497.52855: variable 'interface' from source: play vars 34886 1727204497.52920: variable 'interface' from source: play vars 34886 1727204497.52963: variable '__network_packages_default_wireless' from source: role '' defaults 34886 1727204497.53034: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204497.53281: variable 'network_connections' from source: task vars 34886 1727204497.53284: variable 'interface' from source: play vars 34886 1727204497.53350: variable 'interface' from source: play vars 34886 1727204497.53372: variable '__network_packages_default_team' from source: role '' defaults 34886 1727204497.53441: variable '__network_team_connections_defined' from source: role '' defaults 34886 1727204497.53686: variable 'network_connections' from source: task vars 34886 1727204497.53691: variable 'interface' from source: play vars 34886 1727204497.53752: variable 'interface' from source: play vars 34886 1727204497.53807: variable '__network_service_name_default_initscripts' from source: role '' defaults 34886 1727204497.53859: variable '__network_service_name_default_initscripts' from source: role '' defaults 34886 1727204497.53869: variable '__network_packages_default_initscripts' from source: role '' defaults 34886 1727204497.53920: variable '__network_packages_default_initscripts' from source: role '' defaults 34886 1727204497.54107: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34886 1727204497.54530: variable 'network_connections' from source: task vars 34886 1727204497.54534: variable 'interface' from source: play vars 34886 1727204497.54586: variable 'interface' from source: play vars 34886 
1727204497.54597: variable 'ansible_distribution' from source: facts 34886 1727204497.54601: variable '__network_rh_distros' from source: role '' defaults 34886 1727204497.54608: variable 'ansible_distribution_major_version' from source: facts 34886 1727204497.54633: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34886 1727204497.54780: variable 'ansible_distribution' from source: facts 34886 1727204497.54784: variable '__network_rh_distros' from source: role '' defaults 34886 1727204497.54792: variable 'ansible_distribution_major_version' from source: facts 34886 1727204497.54799: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34886 1727204497.54949: variable 'ansible_distribution' from source: facts 34886 1727204497.54954: variable '__network_rh_distros' from source: role '' defaults 34886 1727204497.54961: variable 'ansible_distribution_major_version' from source: facts 34886 1727204497.54993: variable 'network_provider' from source: set_fact 34886 1727204497.55013: variable 'omit' from source: magic vars 34886 1727204497.55039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204497.55063: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204497.55081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204497.55100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204497.55109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204497.55140: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204497.55143: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204497.55148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204497.55239: Set connection var ansible_timeout to 10 34886 1727204497.55245: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204497.55248: Set connection var ansible_connection to ssh 34886 1727204497.55255: Set connection var ansible_shell_executable to /bin/sh 34886 1727204497.55263: Set connection var ansible_pipelining to False 34886 1727204497.55266: Set connection var ansible_shell_type to sh 34886 1727204497.55291: variable 'ansible_shell_executable' from source: unknown 34886 1727204497.55298: variable 'ansible_connection' from source: unknown 34886 1727204497.55301: variable 'ansible_module_compression' from source: unknown 34886 1727204497.55305: variable 'ansible_shell_type' from source: unknown 34886 1727204497.55308: variable 'ansible_shell_executable' from source: unknown 34886 1727204497.55310: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204497.55316: variable 'ansible_pipelining' from source: unknown 34886 1727204497.55319: variable 'ansible_timeout' from source: unknown 34886 1727204497.55327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204497.55411: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 34886 1727204497.55426: variable 'omit' from source: magic vars 34886 1727204497.55434: starting attempt loop 34886 1727204497.55437: running the handler 34886 1727204497.55502: variable 'ansible_facts' from source: unknown 34886 1727204497.56214: _low_level_execute_command(): starting 34886 1727204497.56220: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204497.56763: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204497.56767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204497.56771: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204497.56774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204497.56839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204497.56842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204497.56850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204497.56884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204497.58688: stdout chunk (state=3): >>>/root <<< 34886 1727204497.58803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204497.58854: stderr chunk (state=3): >>><<< 34886 1727204497.58857: stdout chunk (state=3): >>><<< 34886 1727204497.58878: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204497.58887: _low_level_execute_command(): starting 34886 1727204497.58895: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589 `" && echo ansible-tmp-1727204497.588755-35951-183550405420589="` echo /root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589 `" ) && sleep 0' 34886 1727204497.59359: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204497.59363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204497.59366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204497.59368: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204497.59371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204497.59426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204497.59433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204497.59471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204497.61495: stdout chunk (state=3): >>>ansible-tmp-1727204497.588755-35951-183550405420589=/root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589 <<< 34886 1727204497.61624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204497.61677: stderr chunk (state=3): >>><<< 34886 1727204497.61680: stdout chunk (state=3): >>><<< 34886 1727204497.61693: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204497.588755-35951-183550405420589=/root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 34886 1727204497.61726: variable 'ansible_module_compression' from source: unknown 34886 1727204497.61774: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 34886 1727204497.61780: ANSIBALLZ: Acquiring lock 34886 1727204497.61783: ANSIBALLZ: Lock acquired: 139734986903328 34886 1727204497.61786: ANSIBALLZ: Creating module 34886 1727204497.87730: ANSIBALLZ: Writing module into payload 34886 1727204497.87873: ANSIBALLZ: Writing module 34886 1727204497.87904: ANSIBALLZ: Renaming module 34886 1727204497.87908: ANSIBALLZ: Done creating module 34886 1727204497.87946: variable 'ansible_facts' from source: unknown 34886 1727204497.88087: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589/AnsiballZ_systemd.py 34886 1727204497.88217: Sending initial data 34886 1727204497.88224: Sent initial data (155 bytes) 34886 1727204497.88710: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204497.88717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204497.88720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204497.88722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204497.88774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204497.88779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204497.88833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204497.90584: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204497.90587: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204497.90617: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204497.90657: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpqf_e9muh /root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589/AnsiballZ_systemd.py <<< 34886 1727204497.90662: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589/AnsiballZ_systemd.py" <<< 34886 1727204497.90691: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpqf_e9muh" to remote "/root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589/AnsiballZ_systemd.py" <<< 34886 1727204497.92371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204497.92440: stderr chunk (state=3): >>><<< 34886 1727204497.92444: stdout chunk (state=3): >>><<< 34886 1727204497.92462: done transferring module to remote 34886 1727204497.92473: _low_level_execute_command(): starting 34886 1727204497.92478: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589/ /root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589/AnsiballZ_systemd.py && sleep 0' 34886 1727204497.92941: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204497.92946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204497.92949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204497.92952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204497.93005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204497.93012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204497.93048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204497.94924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204497.94968: stderr chunk (state=3): >>><<< 34886 1727204497.94972: stdout chunk (state=3): >>><<< 34886 1727204497.94984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204497.94987: _low_level_execute_command(): starting 34886 1727204497.94996: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589/AnsiballZ_systemd.py && sleep 0' 34886 1727204497.95450: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204497.95454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204497.95457: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204497.95459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204497.95510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204497.95517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204497.95563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204498.28443: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", 
"ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11857920", "MemoryAvailable": "infinity", "CPUUsageNSec": "1687060000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "in<<< 34886 1727204498.28475: stdout chunk (state=3): >>>finity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", 
"LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service cloud-init.service network.service shutdown.target multi-user.target", "After": "systemd-journald.socket dbus-broker.service system.slice cloud-init-local.service basic.target dbus.socket sysinit.target network-pre.target", "Documentation": 
"\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "l<<< 34886 1727204498.28482: stdout chunk (state=3): >>>oaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:14 EDT", "StateChangeTimestampMonotonic": "499035810", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 34886 1727204498.30460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204498.30527: stderr chunk (state=3): >>><<< 34886 1727204498.30530: stdout chunk (state=3): >>><<< 34886 1727204498.30550: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11857920", "MemoryAvailable": "infinity", "CPUUsageNSec": "1687060000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service cloud-init.service network.service shutdown.target multi-user.target", "After": "systemd-journald.socket dbus-broker.service system.slice cloud-init-local.service basic.target dbus.socket sysinit.target network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:14 EDT", "StateChangeTimestampMonotonic": "499035810", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 34886 1727204498.30721: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204498.30741: _low_level_execute_command(): starting 34886 1727204498.30746: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204497.588755-35951-183550405420589/ > /dev/null 2>&1 && sleep 0' 34886 1727204498.31240: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204498.31244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204498.31248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204498.31251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204498.31253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204498.31310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204498.31317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204498.31352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204498.33282: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 34886 1727204498.33337: stderr chunk (state=3): >>><<< 34886 1727204498.33342: stdout chunk (state=3): >>><<< 34886 1727204498.33355: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204498.33364: handler run complete 34886 1727204498.33422: attempt loop complete, returning result 34886 1727204498.33426: _execute() done 34886 1727204498.33428: dumping result to json 34886 1727204498.33441: done dumping result, returning 34886 1727204498.33453: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-04b9-2e74-000000000023] 34886 1727204498.33459: sending task result for task 12b410aa-8751-04b9-2e74-000000000023 34886 1727204498.33736: done sending task result for task 12b410aa-8751-04b9-2e74-000000000023 34886 1727204498.33738: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34886 1727204498.33794: no more pending results, returning what we have 34886 1727204498.33797: results queue empty 34886 1727204498.33798: checking for any_errors_fatal 34886 1727204498.33805: done checking for any_errors_fatal 34886 1727204498.33806: checking for max_fail_percentage 34886 1727204498.33807: done checking for max_fail_percentage 34886 1727204498.33808: checking to see if all hosts have failed and the running result is not ok 34886 1727204498.33809: done checking to see if all hosts have failed 34886 1727204498.33810: getting the remaining hosts for this loop 34886 1727204498.33812: done getting the remaining hosts for this loop 34886 1727204498.33816: getting the next task for host managed-node3 34886 1727204498.33824: done getting next task for host managed-node3 34886 1727204498.33828: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34886 1727204498.33832: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204498.33844: getting variables 34886 1727204498.33846: in VariableManager get_vars() 34886 1727204498.33887: Calling all_inventory to load vars for managed-node3 34886 1727204498.33892: Calling groups_inventory to load vars for managed-node3 34886 1727204498.33895: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204498.33905: Calling all_plugins_play to load vars for managed-node3 34886 1727204498.33908: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204498.33911: Calling groups_plugins_play to load vars for managed-node3 34886 1727204498.35253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204498.36830: done with get_vars() 34886 1727204498.36852: done getting variables 34886 1727204498.36904: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:01:38 -0400 (0:00:00.901) 0:00:16.537 ***** 34886 1727204498.36935: entering _queue_task() for managed-node3/service 34886 1727204498.37184: worker is 1 (out of 1 available) 34886 1727204498.37201: exiting _queue_task() for managed-node3/service 34886 1727204498.37215: done queuing things up, now waiting for results queue to drain 34886 1727204498.37217: waiting for pending results... 
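
For reference, the ansible.legacy.systemd invocation recorded above (module_args name=NetworkManager, state=started, enabled=true, executed with no_log) would be produced by a task roughly like the following sketch. This is an illustrative reconstruction, not the role's actual task file; only the module arguments, task name, and no_log behaviour are taken from the log output above.

    # Hypothetical sketch; module arguments mirror the logged invocation.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true
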
34886 1727204498.37404: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34886 1727204498.37505: in run() - task 12b410aa-8751-04b9-2e74-000000000024 34886 1727204498.37521: variable 'ansible_search_path' from source: unknown 34886 1727204498.37525: variable 'ansible_search_path' from source: unknown 34886 1727204498.37559: calling self._execute() 34886 1727204498.37639: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204498.37646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204498.37656: variable 'omit' from source: magic vars 34886 1727204498.37971: variable 'ansible_distribution_major_version' from source: facts 34886 1727204498.37982: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204498.38082: variable 'network_provider' from source: set_fact 34886 1727204498.38087: Evaluated conditional (network_provider == "nm"): True 34886 1727204498.38169: variable '__network_wpa_supplicant_required' from source: role '' defaults 34886 1727204498.38248: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34886 1727204498.38395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204498.40033: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204498.40090: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204498.40124: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204498.40152: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204498.40179: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204498.40255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204498.40282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204498.40308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204498.40341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204498.40355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204498.40400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204498.40422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 34886 1727204498.40441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204498.40472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204498.40487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204498.40597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204498.40601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204498.40604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204498.40607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204498.40609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204498.40716: variable 'network_connections' from source: task vars 34886 1727204498.40731: variable 'interface' from source: play vars 34886 1727204498.40788: variable 'interface' from source: play vars 34886 1727204498.40856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204498.40985: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204498.41018: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204498.41050: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204498.41076: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204498.41114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204498.41135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204498.41158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204498.41183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 
1727204498.41226: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204498.41428: variable 'network_connections' from source: task vars 34886 1727204498.41432: variable 'interface' from source: play vars 34886 1727204498.41484: variable 'interface' from source: play vars 34886 1727204498.41525: Evaluated conditional (__network_wpa_supplicant_required): False 34886 1727204498.41529: when evaluation is False, skipping this task 34886 1727204498.41532: _execute() done 34886 1727204498.41534: dumping result to json 34886 1727204498.41537: done dumping result, returning 34886 1727204498.41544: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-04b9-2e74-000000000024] 34886 1727204498.41556: sending task result for task 12b410aa-8751-04b9-2e74-000000000024 34886 1727204498.41647: done sending task result for task 12b410aa-8751-04b9-2e74-000000000024 34886 1727204498.41650: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 34886 1727204498.41735: no more pending results, returning what we have 34886 1727204498.41739: results queue empty 34886 1727204498.41740: checking for any_errors_fatal 34886 1727204498.41770: done checking for any_errors_fatal 34886 1727204498.41771: checking for max_fail_percentage 34886 1727204498.41774: done checking for max_fail_percentage 34886 1727204498.41775: checking to see if all hosts have failed and the running result is not ok 34886 1727204498.41776: done checking to see if all hosts have failed 34886 1727204498.41777: getting the remaining hosts for this loop 34886 1727204498.41778: done getting the remaining hosts for this loop 34886 1727204498.41781: getting the next task for host managed-node3 34886 1727204498.41787: done getting next task for host managed-node3 34886 1727204498.41792: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34886 1727204498.41795: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204498.41809: getting variables 34886 1727204498.41811: in VariableManager get_vars() 34886 1727204498.41855: Calling all_inventory to load vars for managed-node3 34886 1727204498.41858: Calling groups_inventory to load vars for managed-node3 34886 1727204498.41861: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204498.41876: Calling all_plugins_play to load vars for managed-node3 34886 1727204498.41879: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204498.41885: Calling groups_plugins_play to load vars for managed-node3 34886 1727204498.43086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204498.44672: done with get_vars() 34886 1727204498.44700: done getting variables 34886 1727204498.44750: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:01:38 -0400 (0:00:00.078) 0:00:16.615 ***** 34886 1727204498.44776: entering _queue_task() for managed-node3/service 34886 1727204498.45029: worker is 1 (out of 1 available) 34886 1727204498.45045: exiting _queue_task() for managed-node3/service 34886 1727204498.45059: done queuing things up, now waiting for results queue to drain 34886 1727204498.45061: waiting for pending results... 34886 1727204498.45244: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 34886 1727204498.45347: in run() - task 12b410aa-8751-04b9-2e74-000000000025 34886 1727204498.45360: variable 'ansible_search_path' from source: unknown 34886 1727204498.45364: variable 'ansible_search_path' from source: unknown 34886 1727204498.45400: calling self._execute() 34886 1727204498.45476: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204498.45482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204498.45493: variable 'omit' from source: magic vars 34886 1727204498.45802: variable 'ansible_distribution_major_version' from source: facts 34886 1727204498.45812: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204498.45914: variable 'network_provider' from source: set_fact 34886 1727204498.45921: Evaluated conditional (network_provider == "initscripts"): False 34886 1727204498.45926: when evaluation is False, skipping this task 34886 1727204498.45929: _execute() done 34886 1727204498.45932: dumping result to json 34886 1727204498.45934: done dumping result, returning 34886 1727204498.45946: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-04b9-2e74-000000000025] 34886 1727204498.45951: sending task result for task 12b410aa-8751-04b9-2e74-000000000025 34886 1727204498.46041: done sending task result for task 12b410aa-8751-04b9-2e74-000000000025 34886 1727204498.46044: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34886 
1727204498.46100: no more pending results, returning what we have 34886 1727204498.46104: results queue empty 34886 1727204498.46105: checking for any_errors_fatal 34886 1727204498.46117: done checking for any_errors_fatal 34886 1727204498.46118: checking for max_fail_percentage 34886 1727204498.46122: done checking for max_fail_percentage 34886 1727204498.46123: checking to see if all hosts have failed and the running result is not ok 34886 1727204498.46124: done checking to see if all hosts have failed 34886 1727204498.46125: getting the remaining hosts for this loop 34886 1727204498.46126: done getting the remaining hosts for this loop 34886 1727204498.46130: getting the next task for host managed-node3 34886 1727204498.46136: done getting next task for host managed-node3 34886 1727204498.46142: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34886 1727204498.46145: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204498.46162: getting variables 34886 1727204498.46163: in VariableManager get_vars() 34886 1727204498.46203: Calling all_inventory to load vars for managed-node3 34886 1727204498.46206: Calling groups_inventory to load vars for managed-node3 34886 1727204498.46208: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204498.46218: Calling all_plugins_play to load vars for managed-node3 34886 1727204498.46223: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204498.46227: Calling groups_plugins_play to load vars for managed-node3 34886 1727204498.47527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204498.49094: done with get_vars() 34886 1727204498.49116: done getting variables 34886 1727204498.49168: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:01:38 -0400 (0:00:00.044) 0:00:16.659 ***** 34886 1727204498.49199: entering _queue_task() for managed-node3/copy 34886 1727204498.49452: worker is 1 (out of 1 available) 34886 1727204498.49469: exiting _queue_task() for managed-node3/copy 34886 1727204498.49482: done queuing things up, now waiting for results queue to drain 34886 1727204498.49484: waiting for pending results... 
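The "Enable network service" task above is skipped because network_provider was set to "nm" earlier in the run, so its network_provider == "initscripts" condition evaluates to False (its result is additionally censored because the task sets no_log). The "Ensure initscripts network file dependency is present" task queued above and executed next is gated the same way. A minimal sketch of such a provider-gated task follows, for orientation only; it is not the role's actual source, and the service name is a hypothetical placeholder.

# Illustrative sketch, not the role source: a task gated the way the log
# shows, so it is skipped whenever network_provider is "nm".
- name: Enable network service
  service:
    name: network            # hypothetical service name, for illustration only
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"
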
34886 1727204498.49665: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34886 1727204498.49761: in run() - task 12b410aa-8751-04b9-2e74-000000000026 34886 1727204498.49774: variable 'ansible_search_path' from source: unknown 34886 1727204498.49778: variable 'ansible_search_path' from source: unknown 34886 1727204498.49815: calling self._execute() 34886 1727204498.49897: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204498.49904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204498.49915: variable 'omit' from source: magic vars 34886 1727204498.50234: variable 'ansible_distribution_major_version' from source: facts 34886 1727204498.50244: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204498.50344: variable 'network_provider' from source: set_fact 34886 1727204498.50348: Evaluated conditional (network_provider == "initscripts"): False 34886 1727204498.50354: when evaluation is False, skipping this task 34886 1727204498.50357: _execute() done 34886 1727204498.50362: dumping result to json 34886 1727204498.50367: done dumping result, returning 34886 1727204498.50379: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-04b9-2e74-000000000026] 34886 1727204498.50382: sending task result for task 12b410aa-8751-04b9-2e74-000000000026 34886 1727204498.50481: done sending task result for task 12b410aa-8751-04b9-2e74-000000000026 34886 1727204498.50484: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 34886 1727204498.50541: no more pending results, returning what we have 34886 1727204498.50547: results queue empty 34886 1727204498.50548: checking for any_errors_fatal 34886 1727204498.50555: done checking for any_errors_fatal 34886 1727204498.50556: checking for max_fail_percentage 34886 1727204498.50558: done checking for max_fail_percentage 34886 1727204498.50559: checking to see if all hosts have failed and the running result is not ok 34886 1727204498.50560: done checking to see if all hosts have failed 34886 1727204498.50561: getting the remaining hosts for this loop 34886 1727204498.50562: done getting the remaining hosts for this loop 34886 1727204498.50566: getting the next task for host managed-node3 34886 1727204498.50572: done getting next task for host managed-node3 34886 1727204498.50577: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34886 1727204498.50580: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204498.50605: getting variables 34886 1727204498.50607: in VariableManager get_vars() 34886 1727204498.50648: Calling all_inventory to load vars for managed-node3 34886 1727204498.50651: Calling groups_inventory to load vars for managed-node3 34886 1727204498.50654: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204498.50664: Calling all_plugins_play to load vars for managed-node3 34886 1727204498.50667: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204498.50671: Calling groups_plugins_play to load vars for managed-node3 34886 1727204498.51958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204498.53547: done with get_vars() 34886 1727204498.53570: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:01:38 -0400 (0:00:00.044) 0:00:16.704 ***** 34886 1727204498.53646: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 34886 1727204498.53648: Creating lock for fedora.linux_system_roles.network_connections 34886 1727204498.53901: worker is 1 (out of 1 available) 34886 1727204498.53917: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 34886 1727204498.53934: done queuing things up, now waiting for results queue to drain 34886 1727204498.53936: waiting for pending results... 34886 1727204498.54122: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34886 1727204498.54215: in run() - task 12b410aa-8751-04b9-2e74-000000000027 34886 1727204498.54230: variable 'ansible_search_path' from source: unknown 34886 1727204498.54233: variable 'ansible_search_path' from source: unknown 34886 1727204498.54269: calling self._execute() 34886 1727204498.54352: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204498.54358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204498.54368: variable 'omit' from source: magic vars 34886 1727204498.54684: variable 'ansible_distribution_major_version' from source: facts 34886 1727204498.54695: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204498.54702: variable 'omit' from source: magic vars 34886 1727204498.54752: variable 'omit' from source: magic vars 34886 1727204498.54893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204498.56552: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204498.56609: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204498.56640: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204498.56673: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204498.56697: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204498.56763: variable 'network_provider' from source: set_fact 34886 1727204498.56873: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204498.56914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204498.56937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204498.56969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204498.56981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204498.57047: variable 'omit' from source: magic vars 34886 1727204498.57141: variable 'omit' from source: magic vars 34886 1727204498.57231: variable 'network_connections' from source: task vars 34886 1727204498.57241: variable 'interface' from source: play vars 34886 1727204498.57297: variable 'interface' from source: play vars 34886 1727204498.57433: variable 'omit' from source: magic vars 34886 1727204498.57443: variable '__lsr_ansible_managed' from source: task vars 34886 1727204498.57494: variable '__lsr_ansible_managed' from source: task vars 34886 1727204498.57724: Loaded config def from plugin (lookup/template) 34886 1727204498.57728: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 34886 1727204498.57750: File lookup term: get_ansible_managed.j2 34886 1727204498.57754: variable 'ansible_search_path' from source: unknown 34886 1727204498.57760: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 34886 1727204498.57776: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 34886 1727204498.57793: variable 'ansible_search_path' from source: unknown 34886 1727204498.63129: variable 'ansible_managed' from source: unknown 34886 1727204498.63258: variable 'omit' from source: magic vars 34886 1727204498.63283: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204498.63311: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204498.63328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204498.63344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204498.63354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204498.63378: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204498.63381: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204498.63386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204498.63472: Set connection var ansible_timeout to 10 34886 1727204498.63478: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204498.63481: Set connection var ansible_connection to ssh 34886 1727204498.63488: Set connection var ansible_shell_executable to /bin/sh 34886 1727204498.63498: Set connection var ansible_pipelining to False 34886 1727204498.63501: Set connection var ansible_shell_type to sh 34886 1727204498.63529: variable 'ansible_shell_executable' from source: unknown 34886 1727204498.63534: variable 'ansible_connection' from source: unknown 34886 1727204498.63537: variable 'ansible_module_compression' from source: unknown 34886 1727204498.63539: variable 'ansible_shell_type' from source: unknown 34886 1727204498.63542: variable 'ansible_shell_executable' from source: unknown 34886 1727204498.63544: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204498.63546: variable 'ansible_pipelining' from source: unknown 34886 1727204498.63553: variable 'ansible_timeout' from source: unknown 34886 1727204498.63556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204498.63667: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34886 1727204498.63678: variable 'omit' from source: magic vars 34886 1727204498.63685: starting attempt loop 34886 1727204498.63688: running the handler 34886 1727204498.63705: _low_level_execute_command(): starting 34886 1727204498.63711: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204498.64254: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204498.64258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204498.64260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204498.64263: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204498.64265: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204498.64325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204498.64328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204498.64332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204498.64378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204498.66145: stdout chunk (state=3): >>>/root <<< 34886 1727204498.66256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204498.66315: stderr chunk (state=3): >>><<< 34886 1727204498.66318: stdout chunk (state=3): >>><<< 34886 1727204498.66343: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204498.66356: _low_level_execute_command(): starting 34886 1727204498.66363: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244 `" && echo ansible-tmp-1727204498.6634355-35969-210614776898244="` echo /root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244 `" ) && sleep 0' 34886 1727204498.66833: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204498.66836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204498.66839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204498.66842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204498.66906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204498.66909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204498.66941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204498.68953: stdout chunk (state=3): >>>ansible-tmp-1727204498.6634355-35969-210614776898244=/root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244 <<< 34886 1727204498.69073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204498.69120: stderr chunk (state=3): >>><<< 34886 1727204498.69126: stdout chunk (state=3): >>><<< 34886 1727204498.69141: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204498.6634355-35969-210614776898244=/root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204498.69185: variable 'ansible_module_compression' from source: unknown 34886 1727204498.69229: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 34886 1727204498.69233: ANSIBALLZ: Acquiring lock 34886 1727204498.69236: ANSIBALLZ: Lock acquired: 139734986798144 34886 1727204498.69241: ANSIBALLZ: Creating module 34886 1727204498.91946: ANSIBALLZ: Writing module into payload 34886 1727204498.92487: ANSIBALLZ: Writing module 34886 1727204498.92493: ANSIBALLZ: Renaming module 34886 1727204498.92496: ANSIBALLZ: Done creating module 34886 1727204498.92514: variable 'ansible_facts' from source: unknown 34886 1727204498.92648: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244/AnsiballZ_network_connections.py 34886 1727204498.92844: Sending initial data 34886 1727204498.92958: Sent initial data (168 bytes) 34886 1727204498.93483: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204498.93503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204498.93624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204498.93656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204498.93673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204498.93697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204498.93759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204498.95479: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204498.95496: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204498.95517: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204498.95552: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp5mhyky_d /root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244/AnsiballZ_network_connections.py <<< 34886 1727204498.95556: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244/AnsiballZ_network_connections.py" <<< 34886 1727204498.95585: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp5mhyky_d" to remote "/root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244/AnsiballZ_network_connections.py" <<< 34886 1727204498.96733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204498.96796: stderr chunk (state=3): >>><<< 34886 1727204498.96800: stdout chunk (state=3): >>><<< 34886 1727204498.96825: done transferring module to remote 34886 1727204498.96832: _low_level_execute_command(): starting 34886 1727204498.96841: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244/ /root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244/AnsiballZ_network_connections.py && sleep 0' 34886 1727204498.97314: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204498.97318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204498.97320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204498.97323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204498.97325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204498.97388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204498.97440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204498.97484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204498.99299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204498.99348: stderr chunk (state=3): >>><<< 34886 1727204498.99351: stdout chunk (state=3): >>><<< 34886 1727204498.99367: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204498.99370: _low_level_execute_command(): starting 34886 1727204498.99376: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244/AnsiballZ_network_connections.py && sleep 0' 34886 1727204498.99827: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204498.99830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204498.99833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204498.99838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204498.99840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204498.99895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204498.99902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204498.99942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204501.28609: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 29a45e80-a1b1-4083-9f57-453b97dfb981\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 29a45e80-a1b1-4083-9f57-453b97dfb981 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": 
false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 34886 1727204501.31045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204501.31049: stdout chunk (state=3): >>><<< 34886 1727204501.31051: stderr chunk (state=3): >>><<< 34886 1727204501.31095: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 29a45e80-a1b1-4083-9f57-453b97dfb981\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 29a45e80-a1b1-4083-9f57-453b97dfb981 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
34886 1727204501.31153: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/32', '2001:db8::3/32', '2001:db8::4/32'], 'gateway6': '2001:db8::1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204501.31175: _low_level_execute_command(): starting 34886 1727204501.31428: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204498.6634355-35969-210614776898244/ > /dev/null 2>&1 && sleep 0' 34886 1727204501.32195: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204501.32199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204501.32202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204501.32208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204501.32225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204501.32245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204501.32316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204501.34372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204501.34385: stdout chunk (state=3): >>><<< 34886 1727204501.34401: stderr chunk (state=3): >>><<< 34886 1727204501.34427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204501.34446: handler run complete 34886 1727204501.34502: attempt loop complete, returning result 34886 1727204501.34512: _execute() done 34886 1727204501.34524: dumping result to json 34886 1727204501.34538: done dumping result, returning 34886 1727204501.34555: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-04b9-2e74-000000000027] 34886 1727204501.34565: sending task result for task 12b410aa-8751-04b9-2e74-000000000027 changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 29a45e80-a1b1-4083-9f57-453b97dfb981 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 29a45e80-a1b1-4083-9f57-453b97dfb981 (not-active) 34886 1727204501.34863: no more pending results, returning what we have 34886 1727204501.34866: results queue empty 34886 1727204501.34868: checking for any_errors_fatal 34886 1727204501.34874: done checking for any_errors_fatal 34886 1727204501.34875: checking for max_fail_percentage 34886 1727204501.34877: done checking for max_fail_percentage 34886 1727204501.34878: checking to see if all hosts have failed and the running result is not ok 34886 1727204501.34879: done checking to see if all hosts have failed 34886 1727204501.34879: getting the remaining hosts for this loop 34886 1727204501.34881: done getting the remaining hosts for this loop 34886 1727204501.34885: getting the next task for host managed-node3 34886 1727204501.34899: done getting next task for host managed-node3 34886 1727204501.34904: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34886 1727204501.34907: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204501.34922: getting variables 34886 1727204501.34924: in VariableManager get_vars() 34886 1727204501.34969: Calling all_inventory to load vars for managed-node3 34886 1727204501.34973: Calling groups_inventory to load vars for managed-node3 34886 1727204501.34976: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204501.34987: Calling all_plugins_play to load vars for managed-node3 34886 1727204501.34992: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204501.34997: Calling groups_plugins_play to load vars for managed-node3 34886 1727204501.35633: done sending task result for task 12b410aa-8751-04b9-2e74-000000000027 34886 1727204501.35637: WORKER PROCESS EXITING 34886 1727204501.37300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204501.40253: done with get_vars() 34886 1727204501.40300: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:01:41 -0400 (0:00:02.867) 0:00:19.571 ***** 34886 1727204501.40411: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 34886 1727204501.40413: Creating lock for fedora.linux_system_roles.network_state 34886 1727204501.41023: worker is 1 (out of 1 available) 34886 1727204501.41035: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 34886 1727204501.41047: done queuing things up, now waiting for results queue to drain 34886 1727204501.41049: waiting for pending results... 34886 1727204501.41137: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 34886 1727204501.41315: in run() - task 12b410aa-8751-04b9-2e74-000000000028 34886 1727204501.41343: variable 'ansible_search_path' from source: unknown 34886 1727204501.41353: variable 'ansible_search_path' from source: unknown 34886 1727204501.41404: calling self._execute() 34886 1727204501.41516: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204501.41534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204501.41553: variable 'omit' from source: magic vars 34886 1727204501.42044: variable 'ansible_distribution_major_version' from source: facts 34886 1727204501.42048: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204501.42204: variable 'network_state' from source: role '' defaults 34886 1727204501.42226: Evaluated conditional (network_state != {}): False 34886 1727204501.42235: when evaluation is False, skipping this task 34886 1727204501.42243: _execute() done 34886 1727204501.42261: dumping result to json 34886 1727204501.42265: done dumping result, returning 34886 1727204501.42294: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-04b9-2e74-000000000028] 34886 1727204501.42298: sending task result for task 12b410aa-8751-04b9-2e74-000000000028 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34886 1727204501.42592: no more pending results, returning what we have 34886 1727204501.42598: results queue empty 34886 1727204501.42599: checking for any_errors_fatal 34886 1727204501.42614: done checking for 
any_errors_fatal 34886 1727204501.42615: checking for max_fail_percentage 34886 1727204501.42617: done checking for max_fail_percentage 34886 1727204501.42618: checking to see if all hosts have failed and the running result is not ok 34886 1727204501.42622: done checking to see if all hosts have failed 34886 1727204501.42623: getting the remaining hosts for this loop 34886 1727204501.42625: done getting the remaining hosts for this loop 34886 1727204501.42631: getting the next task for host managed-node3 34886 1727204501.42638: done getting next task for host managed-node3 34886 1727204501.42642: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34886 1727204501.42647: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204501.42666: getting variables 34886 1727204501.42668: in VariableManager get_vars() 34886 1727204501.42899: Calling all_inventory to load vars for managed-node3 34886 1727204501.42903: Calling groups_inventory to load vars for managed-node3 34886 1727204501.42905: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204501.42911: done sending task result for task 12b410aa-8751-04b9-2e74-000000000028 34886 1727204501.42914: WORKER PROCESS EXITING 34886 1727204501.42925: Calling all_plugins_play to load vars for managed-node3 34886 1727204501.42929: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204501.42932: Calling groups_plugins_play to load vars for managed-node3 34886 1727204501.45268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204501.48246: done with get_vars() 34886 1727204501.48285: done getting variables 34886 1727204501.48362: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:01:41 -0400 (0:00:00.079) 0:00:19.651 ***** 34886 1727204501.48407: entering _queue_task() for managed-node3/debug 34886 1727204501.48786: worker is 1 (out of 1 available) 34886 1727204501.49001: exiting _queue_task() for managed-node3/debug 34886 1727204501.49013: done queuing things up, now waiting for results queue to drain 34886 1727204501.49015: waiting for pending results... 
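The debug task queued here (main.yml:177) prints the stderr lines captured from the network_connections module run. Judging by the result format shown just below, it behaves like the following sketch; this is a reconstruction from the output, not the role's actual source.

# Reconstructed from the printed result format, not taken from the role source.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
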
34886 1727204501.49124: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34886 1727204501.49303: in run() - task 12b410aa-8751-04b9-2e74-000000000029 34886 1727204501.49356: variable 'ansible_search_path' from source: unknown 34886 1727204501.49360: variable 'ansible_search_path' from source: unknown 34886 1727204501.49391: calling self._execute() 34886 1727204501.49573: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204501.49578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204501.49581: variable 'omit' from source: magic vars 34886 1727204501.49993: variable 'ansible_distribution_major_version' from source: facts 34886 1727204501.50018: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204501.50035: variable 'omit' from source: magic vars 34886 1727204501.50110: variable 'omit' from source: magic vars 34886 1727204501.50168: variable 'omit' from source: magic vars 34886 1727204501.50226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204501.50275: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204501.50306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204501.50342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204501.50451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204501.50455: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204501.50458: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204501.50460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204501.50554: Set connection var ansible_timeout to 10 34886 1727204501.50571: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204501.50580: Set connection var ansible_connection to ssh 34886 1727204501.50594: Set connection var ansible_shell_executable to /bin/sh 34886 1727204501.50611: Set connection var ansible_pipelining to False 34886 1727204501.50622: Set connection var ansible_shell_type to sh 34886 1727204501.50652: variable 'ansible_shell_executable' from source: unknown 34886 1727204501.50669: variable 'ansible_connection' from source: unknown 34886 1727204501.50672: variable 'ansible_module_compression' from source: unknown 34886 1727204501.50674: variable 'ansible_shell_type' from source: unknown 34886 1727204501.50779: variable 'ansible_shell_executable' from source: unknown 34886 1727204501.50783: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204501.50786: variable 'ansible_pipelining' from source: unknown 34886 1727204501.50788: variable 'ansible_timeout' from source: unknown 34886 1727204501.50793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204501.50895: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 
1727204501.50913: variable 'omit' from source: magic vars 34886 1727204501.50925: starting attempt loop 34886 1727204501.50933: running the handler 34886 1727204501.51074: variable '__network_connections_result' from source: set_fact 34886 1727204501.51150: handler run complete 34886 1727204501.51182: attempt loop complete, returning result 34886 1727204501.51195: _execute() done 34886 1727204501.51203: dumping result to json 34886 1727204501.51216: done dumping result, returning 34886 1727204501.51234: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-04b9-2e74-000000000029] 34886 1727204501.51245: sending task result for task 12b410aa-8751-04b9-2e74-000000000029 ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 29a45e80-a1b1-4083-9f57-453b97dfb981", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 29a45e80-a1b1-4083-9f57-453b97dfb981 (not-active)" ] } 34886 1727204501.51501: no more pending results, returning what we have 34886 1727204501.51505: results queue empty 34886 1727204501.51507: checking for any_errors_fatal 34886 1727204501.51517: done checking for any_errors_fatal 34886 1727204501.51518: checking for max_fail_percentage 34886 1727204501.51522: done checking for max_fail_percentage 34886 1727204501.51523: checking to see if all hosts have failed and the running result is not ok 34886 1727204501.51524: done checking to see if all hosts have failed 34886 1727204501.51525: getting the remaining hosts for this loop 34886 1727204501.51527: done getting the remaining hosts for this loop 34886 1727204501.51532: getting the next task for host managed-node3 34886 1727204501.51538: done getting next task for host managed-node3 34886 1727204501.51543: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34886 1727204501.51548: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204501.51562: getting variables 34886 1727204501.51564: in VariableManager get_vars() 34886 1727204501.51858: Calling all_inventory to load vars for managed-node3 34886 1727204501.51862: Calling groups_inventory to load vars for managed-node3 34886 1727204501.51865: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204501.51872: done sending task result for task 12b410aa-8751-04b9-2e74-000000000029 34886 1727204501.51875: WORKER PROCESS EXITING 34886 1727204501.51885: Calling all_plugins_play to load vars for managed-node3 34886 1727204501.51892: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204501.51896: Calling groups_plugins_play to load vars for managed-node3 34886 1727204501.54210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204501.57171: done with get_vars() 34886 1727204501.57211: done getting variables 34886 1727204501.57285: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:01:41 -0400 (0:00:00.089) 0:00:19.741 ***** 34886 1727204501.57331: entering _queue_task() for managed-node3/debug 34886 1727204501.57739: worker is 1 (out of 1 available) 34886 1727204501.57754: exiting _queue_task() for managed-node3/debug 34886 1727204501.57767: done queuing things up, now waiting for results queue to drain 34886 1727204501.57769: waiting for pending results... 
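The task header above ("Show debug messages for the network_connections", tasks/main.yml:181) leads into a debug dump of the full __network_connections_result fact, including the module_args the role handed to the nm provider. A hedged reconstruction of the network_connections input that would produce those module_args, with every value copied from the result printed below (the key names mirror the module_args; treat this as a sketch rather than the exact play source):

    network_connections:
      - name: veth0
        type: ethernet
        state: up
        ip:
          dhcp4: false
          auto6: false
          gateway6: "2001:db8::1"
          address:
            - "2001:db8::2/32"
            - "2001:db8::3/32"
            - "2001:db8::4/32"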
34886 1727204501.58044: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34886 1727204501.58205: in run() - task 12b410aa-8751-04b9-2e74-00000000002a 34886 1727204501.58237: variable 'ansible_search_path' from source: unknown 34886 1727204501.58247: variable 'ansible_search_path' from source: unknown 34886 1727204501.58293: calling self._execute() 34886 1727204501.58405: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204501.58424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204501.58448: variable 'omit' from source: magic vars 34886 1727204501.58910: variable 'ansible_distribution_major_version' from source: facts 34886 1727204501.58932: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204501.58945: variable 'omit' from source: magic vars 34886 1727204501.59025: variable 'omit' from source: magic vars 34886 1727204501.59076: variable 'omit' from source: magic vars 34886 1727204501.59133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204501.59178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204501.59211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204501.59241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204501.59261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204501.59307: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204501.59317: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204501.59331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204501.59468: Set connection var ansible_timeout to 10 34886 1727204501.59482: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204501.59526: Set connection var ansible_connection to ssh 34886 1727204501.59530: Set connection var ansible_shell_executable to /bin/sh 34886 1727204501.59532: Set connection var ansible_pipelining to False 34886 1727204501.59535: Set connection var ansible_shell_type to sh 34886 1727204501.59564: variable 'ansible_shell_executable' from source: unknown 34886 1727204501.59573: variable 'ansible_connection' from source: unknown 34886 1727204501.59582: variable 'ansible_module_compression' from source: unknown 34886 1727204501.59593: variable 'ansible_shell_type' from source: unknown 34886 1727204501.59635: variable 'ansible_shell_executable' from source: unknown 34886 1727204501.59638: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204501.59641: variable 'ansible_pipelining' from source: unknown 34886 1727204501.59643: variable 'ansible_timeout' from source: unknown 34886 1727204501.59646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204501.59814: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 
1727204501.59838: variable 'omit' from source: magic vars 34886 1727204501.59895: starting attempt loop 34886 1727204501.59899: running the handler 34886 1727204501.59929: variable '__network_connections_result' from source: set_fact 34886 1727204501.60032: variable '__network_connections_result' from source: set_fact 34886 1727204501.60205: handler run complete 34886 1727204501.60254: attempt loop complete, returning result 34886 1727204501.60263: _execute() done 34886 1727204501.60271: dumping result to json 34886 1727204501.60494: done dumping result, returning 34886 1727204501.60499: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-04b9-2e74-00000000002a] 34886 1727204501.60501: sending task result for task 12b410aa-8751-04b9-2e74-00000000002a 34886 1727204501.60579: done sending task result for task 12b410aa-8751-04b9-2e74-00000000002a 34886 1727204501.60582: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 29a45e80-a1b1-4083-9f57-453b97dfb981\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 29a45e80-a1b1-4083-9f57-453b97dfb981 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 29a45e80-a1b1-4083-9f57-453b97dfb981", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 29a45e80-a1b1-4083-9f57-453b97dfb981 (not-active)" ] } } 34886 1727204501.60705: no more pending results, returning what we have 34886 1727204501.60709: results queue empty 34886 1727204501.60710: checking for any_errors_fatal 34886 1727204501.60722: done checking for any_errors_fatal 34886 1727204501.60723: checking for max_fail_percentage 34886 1727204501.60725: done checking for max_fail_percentage 34886 1727204501.60727: checking to see if all hosts have failed and the running result is not ok 34886 1727204501.60728: done checking to see if all hosts have failed 34886 1727204501.60729: getting the remaining hosts for this loop 34886 1727204501.60730: done getting the remaining hosts for this loop 34886 1727204501.60735: getting the next task for host managed-node3 34886 1727204501.60742: done getting next task for host managed-node3 34886 1727204501.60746: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34886 1727204501.60750: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204501.60764: getting variables 34886 1727204501.60766: in VariableManager get_vars() 34886 1727204501.60984: Calling all_inventory to load vars for managed-node3 34886 1727204501.60996: Calling groups_inventory to load vars for managed-node3 34886 1727204501.60999: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204501.61010: Calling all_plugins_play to load vars for managed-node3 34886 1727204501.61014: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204501.61018: Calling groups_plugins_play to load vars for managed-node3 34886 1727204501.63188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204501.66149: done with get_vars() 34886 1727204501.66188: done getting variables 34886 1727204501.66265: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:01:41 -0400 (0:00:00.089) 0:00:19.830 ***** 34886 1727204501.66308: entering _queue_task() for managed-node3/debug 34886 1727204501.66677: worker is 1 (out of 1 available) 34886 1727204501.66897: exiting _queue_task() for managed-node3/debug 34886 1727204501.66908: done queuing things up, now waiting for results queue to drain 34886 1727204501.66910: waiting for pending results... 34886 1727204501.67016: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34886 1727204501.67196: in run() - task 12b410aa-8751-04b9-2e74-00000000002b 34886 1727204501.67222: variable 'ansible_search_path' from source: unknown 34886 1727204501.67232: variable 'ansible_search_path' from source: unknown 34886 1727204501.67281: calling self._execute() 34886 1727204501.67465: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204501.67468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204501.67471: variable 'omit' from source: magic vars 34886 1727204501.67864: variable 'ansible_distribution_major_version' from source: facts 34886 1727204501.67884: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204501.68050: variable 'network_state' from source: role '' defaults 34886 1727204501.68068: Evaluated conditional (network_state != {}): False 34886 1727204501.68076: when evaluation is False, skipping this task 34886 1727204501.68085: _execute() done 34886 1727204501.68096: dumping result to json 34886 1727204501.68105: done dumping result, returning 34886 1727204501.68125: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-04b9-2e74-00000000002b] 34886 1727204501.68138: sending task result for task 12b410aa-8751-04b9-2e74-00000000002b skipping: [managed-node3] => { "false_condition": "network_state != {}" } 34886 1727204501.68301: no more pending results, returning what we have 34886 1727204501.68306: results queue empty 34886 1727204501.68307: checking for any_errors_fatal 34886 1727204501.68322: done checking 
for any_errors_fatal 34886 1727204501.68323: checking for max_fail_percentage 34886 1727204501.68325: done checking for max_fail_percentage 34886 1727204501.68327: checking to see if all hosts have failed and the running result is not ok 34886 1727204501.68328: done checking to see if all hosts have failed 34886 1727204501.68329: getting the remaining hosts for this loop 34886 1727204501.68330: done getting the remaining hosts for this loop 34886 1727204501.68335: getting the next task for host managed-node3 34886 1727204501.68343: done getting next task for host managed-node3 34886 1727204501.68348: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34886 1727204501.68354: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204501.68372: getting variables 34886 1727204501.68374: in VariableManager get_vars() 34886 1727204501.68428: Calling all_inventory to load vars for managed-node3 34886 1727204501.68432: Calling groups_inventory to load vars for managed-node3 34886 1727204501.68436: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204501.68450: Calling all_plugins_play to load vars for managed-node3 34886 1727204501.68454: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204501.68458: Calling groups_plugins_play to load vars for managed-node3 34886 1727204501.69297: done sending task result for task 12b410aa-8751-04b9-2e74-00000000002b 34886 1727204501.69300: WORKER PROCESS EXITING 34886 1727204501.71026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204501.73997: done with get_vars() 34886 1727204501.74043: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:01:41 -0400 (0:00:00.078) 0:00:19.909 ***** 34886 1727204501.74165: entering _queue_task() for managed-node3/ping 34886 1727204501.74168: Creating lock for ping 34886 1727204501.74568: worker is 1 (out of 1 available) 34886 1727204501.74584: exiting _queue_task() for managed-node3/ping 34886 1727204501.74700: done queuing things up, now waiting for results queue to drain 34886 1727204501.74703: waiting for pending results... 
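The header above starts the role's "Re-test connectivity" task (tasks/main.yml:192), which the log resolves to the ping module via the 'normal' action plugin; the run that follows returns "ping": "pong". A minimal sketch of that task, assuming the bare module form with no extra options (only the task name and the use of ping are confirmed by this log):

    - name: Re-test connectivity
      ping: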
34886 1727204501.75010: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 34886 1727204501.75104: in run() - task 12b410aa-8751-04b9-2e74-00000000002c 34886 1727204501.75136: variable 'ansible_search_path' from source: unknown 34886 1727204501.75145: variable 'ansible_search_path' from source: unknown 34886 1727204501.75195: calling self._execute() 34886 1727204501.75347: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204501.75351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204501.75354: variable 'omit' from source: magic vars 34886 1727204501.75802: variable 'ansible_distribution_major_version' from source: facts 34886 1727204501.75825: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204501.75839: variable 'omit' from source: magic vars 34886 1727204501.75996: variable 'omit' from source: magic vars 34886 1727204501.76000: variable 'omit' from source: magic vars 34886 1727204501.76029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204501.76077: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204501.76112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204501.76143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204501.76164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204501.76206: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204501.76223: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204501.76233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204501.76375: Set connection var ansible_timeout to 10 34886 1727204501.76391: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204501.76401: Set connection var ansible_connection to ssh 34886 1727204501.76413: Set connection var ansible_shell_executable to /bin/sh 34886 1727204501.76595: Set connection var ansible_pipelining to False 34886 1727204501.76599: Set connection var ansible_shell_type to sh 34886 1727204501.76601: variable 'ansible_shell_executable' from source: unknown 34886 1727204501.76603: variable 'ansible_connection' from source: unknown 34886 1727204501.76606: variable 'ansible_module_compression' from source: unknown 34886 1727204501.76608: variable 'ansible_shell_type' from source: unknown 34886 1727204501.76610: variable 'ansible_shell_executable' from source: unknown 34886 1727204501.76612: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204501.76615: variable 'ansible_pipelining' from source: unknown 34886 1727204501.76617: variable 'ansible_timeout' from source: unknown 34886 1727204501.76622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204501.76788: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34886 1727204501.76810: variable 'omit' from source: magic vars 34886 
1727204501.76823: starting attempt loop 34886 1727204501.76832: running the handler 34886 1727204501.76859: _low_level_execute_command(): starting 34886 1727204501.76874: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204501.77664: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204501.77679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204501.77700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204501.77729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204501.77843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 34886 1727204501.77862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204501.77888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204501.77967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204501.79716: stdout chunk (state=3): >>>/root <<< 34886 1727204501.79830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204501.79883: stderr chunk (state=3): >>><<< 34886 1727204501.79886: stdout chunk (state=3): >>><<< 34886 1727204501.79912: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204501.79927: _low_level_execute_command(): starting 34886 1727204501.79934: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590 `" && echo 
ansible-tmp-1727204501.7991178-36054-140755935905590="` echo /root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590 `" ) && sleep 0' 34886 1727204501.80384: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204501.80390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204501.80403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204501.80447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204501.80454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204501.80502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204501.82450: stdout chunk (state=3): >>>ansible-tmp-1727204501.7991178-36054-140755935905590=/root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590 <<< 34886 1727204501.82573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204501.82626: stderr chunk (state=3): >>><<< 34886 1727204501.82630: stdout chunk (state=3): >>><<< 34886 1727204501.82650: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204501.7991178-36054-140755935905590=/root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204501.82695: variable 'ansible_module_compression' from source: unknown 34886 1727204501.82736: ANSIBALLZ: Using lock for ping 34886 1727204501.82740: ANSIBALLZ: Acquiring lock 34886 1727204501.82742: ANSIBALLZ: Lock acquired: 139734982070960 
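The "Set connection var" lines logged above for this task (ansible_connection=ssh, ansible_timeout=10, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_pipelining=False, ansible_module_compression=ZIP_DEFLATED) account for the transfer sequence that follows: with pipelining disabled, Ansible creates a remote temp directory, sftps the AnsiballZ_ping.py payload into it, runs it with the remote python3.12, and then removes the directory. A sketch of equivalent inventory variables, assuming host_vars placement for managed-node3 (the values are taken from this log; where they are actually defined is not shown here):

    ansible_connection: ssh
    ansible_timeout: 10
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_pipelining: false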
34886 1727204501.82747: ANSIBALLZ: Creating module 34886 1727204501.98999: ANSIBALLZ: Writing module into payload 34886 1727204501.99004: ANSIBALLZ: Writing module 34886 1727204501.99006: ANSIBALLZ: Renaming module 34886 1727204501.99009: ANSIBALLZ: Done creating module 34886 1727204501.99017: variable 'ansible_facts' from source: unknown 34886 1727204501.99101: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590/AnsiballZ_ping.py 34886 1727204501.99360: Sending initial data 34886 1727204501.99372: Sent initial data (153 bytes) 34886 1727204501.99941: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204501.99961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204502.00004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204502.00026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204502.00122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204502.00235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204502.00503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204502.02034: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204502.02115: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204502.02175: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpfcmeg_hu /root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590/AnsiballZ_ping.py <<< 34886 1727204502.02179: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590/AnsiballZ_ping.py" <<< 34886 1727204502.02228: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpfcmeg_hu" to remote "/root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590/AnsiballZ_ping.py" <<< 34886 1727204502.03994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204502.03998: stdout chunk (state=3): >>><<< 34886 1727204502.04000: stderr chunk (state=3): >>><<< 34886 1727204502.04020: done transferring module to remote 34886 1727204502.04037: _low_level_execute_command(): starting 34886 1727204502.04058: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590/ /root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590/AnsiballZ_ping.py && sleep 0' 34886 1727204502.04688: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204502.04707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204502.04731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204502.04755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204502.04774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204502.04787: stderr chunk (state=3): >>>debug2: match not found <<< 34886 1727204502.04804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.04822: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34886 1727204502.04846: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204502.04953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 34886 1727204502.04971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204502.04991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204502.05068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204502.06929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204502.06950: stderr chunk (state=3): >>><<< 34886 1727204502.06954: stdout chunk (state=3): >>><<< 34886 1727204502.06970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204502.06973: _low_level_execute_command(): starting 34886 1727204502.06981: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590/AnsiballZ_ping.py && sleep 0' 34886 1727204502.07425: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204502.07428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.07431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204502.07434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.07484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204502.07487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204502.07534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204502.24868: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 34886 1727204502.26326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204502.26385: stderr chunk (state=3): >>><<< 34886 1727204502.26393: stdout chunk (state=3): >>><<< 34886 1727204502.26410: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 34886 1727204502.26437: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204502.26447: _low_level_execute_command(): starting 34886 1727204502.26453: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204501.7991178-36054-140755935905590/ > /dev/null 2>&1 && sleep 0' 34886 1727204502.26952: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204502.26955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.26958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204502.26961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204502.26963: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.27014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204502.27017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204502.27059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204502.28965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204502.29018: stderr chunk (state=3): >>><<< 34886 1727204502.29024: stdout chunk (state=3): >>><<< 34886 1727204502.29037: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204502.29044: handler run complete 34886 1727204502.29060: attempt loop complete, returning result 34886 1727204502.29063: _execute() done 34886 1727204502.29065: dumping result to json 34886 1727204502.29071: done dumping result, returning 34886 1727204502.29080: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-04b9-2e74-00000000002c] 34886 1727204502.29086: sending task result for task 12b410aa-8751-04b9-2e74-00000000002c 34886 1727204502.29194: done sending task result for task 12b410aa-8751-04b9-2e74-00000000002c 34886 1727204502.29197: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 34886 1727204502.29262: no more pending results, returning what we have 34886 1727204502.29266: results queue empty 34886 1727204502.29267: checking for any_errors_fatal 34886 1727204502.29274: done checking for any_errors_fatal 34886 1727204502.29275: checking for max_fail_percentage 34886 1727204502.29277: done checking for max_fail_percentage 34886 1727204502.29277: checking to see if all hosts have failed and the running result is not ok 34886 1727204502.29278: done checking to see if all hosts have failed 34886 1727204502.29279: getting the remaining hosts for this loop 34886 1727204502.29281: done getting the remaining hosts for this loop 34886 1727204502.29285: getting the next task for host managed-node3 34886 1727204502.29297: done getting next task for host managed-node3 34886 1727204502.29300: ^ task is: TASK: meta (role_complete) 34886 1727204502.29303: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204502.29316: getting variables 34886 1727204502.29318: in VariableManager get_vars() 34886 1727204502.29368: Calling all_inventory to load vars for managed-node3 34886 1727204502.29371: Calling groups_inventory to load vars for managed-node3 34886 1727204502.29374: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204502.29385: Calling all_plugins_play to load vars for managed-node3 34886 1727204502.29388: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204502.29400: Calling groups_plugins_play to load vars for managed-node3 34886 1727204502.30774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204502.32349: done with get_vars() 34886 1727204502.32374: done getting variables 34886 1727204502.32450: done queuing things up, now waiting for results queue to drain 34886 1727204502.32452: results queue empty 34886 1727204502.32452: checking for any_errors_fatal 34886 1727204502.32455: done checking for any_errors_fatal 34886 1727204502.32455: checking for max_fail_percentage 34886 1727204502.32456: done checking for max_fail_percentage 34886 1727204502.32457: checking to see if all hosts have failed and the running result is not ok 34886 1727204502.32457: done checking to see if all hosts have failed 34886 1727204502.32458: getting the remaining hosts for this loop 34886 1727204502.32458: done getting the remaining hosts for this loop 34886 1727204502.32460: getting the next task for host managed-node3 34886 1727204502.32464: done getting next task for host managed-node3 34886 1727204502.32466: ^ task is: TASK: Include the task 'assert_device_present.yml' 34886 1727204502.32467: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204502.32469: getting variables 34886 1727204502.32469: in VariableManager get_vars() 34886 1727204502.32481: Calling all_inventory to load vars for managed-node3 34886 1727204502.32483: Calling groups_inventory to load vars for managed-node3 34886 1727204502.32485: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204502.32491: Calling all_plugins_play to load vars for managed-node3 34886 1727204502.32493: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204502.32495: Calling groups_plugins_play to load vars for managed-node3 34886 1727204502.33573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204502.35147: done with get_vars() 34886 1727204502.35170: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:47 Tuesday 24 September 2024 15:01:42 -0400 (0:00:00.610) 0:00:20.520 ***** 34886 1727204502.35239: entering _queue_task() for managed-node3/include_tasks 34886 1727204502.35518: worker is 1 (out of 1 available) 34886 1727204502.35536: exiting _queue_task() for managed-node3/include_tasks 34886 1727204502.35553: done queuing things up, now waiting for results queue to drain 34886 1727204502.35555: waiting for pending results... 34886 1727204502.35745: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' 34886 1727204502.35831: in run() - task 12b410aa-8751-04b9-2e74-00000000005c 34886 1727204502.35840: variable 'ansible_search_path' from source: unknown 34886 1727204502.35873: calling self._execute() 34886 1727204502.35956: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204502.35962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204502.35973: variable 'omit' from source: magic vars 34886 1727204502.36294: variable 'ansible_distribution_major_version' from source: facts 34886 1727204502.36306: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204502.36313: _execute() done 34886 1727204502.36321: dumping result to json 34886 1727204502.36324: done dumping result, returning 34886 1727204502.36331: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' [12b410aa-8751-04b9-2e74-00000000005c] 34886 1727204502.36337: sending task result for task 12b410aa-8751-04b9-2e74-00000000005c 34886 1727204502.36440: done sending task result for task 12b410aa-8751-04b9-2e74-00000000005c 34886 1727204502.36444: WORKER PROCESS EXITING 34886 1727204502.36473: no more pending results, returning what we have 34886 1727204502.36479: in VariableManager get_vars() 34886 1727204502.36545: Calling all_inventory to load vars for managed-node3 34886 1727204502.36549: Calling groups_inventory to load vars for managed-node3 34886 1727204502.36552: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204502.36564: Calling all_plugins_play to load vars for managed-node3 34886 1727204502.36567: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204502.36570: Calling groups_plugins_play to load vars for managed-node3 34886 1727204502.41341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204502.42896: done with get_vars() 34886 
1727204502.42924: variable 'ansible_search_path' from source: unknown 34886 1727204502.42936: we have included files to process 34886 1727204502.42937: generating all_blocks data 34886 1727204502.42938: done generating all_blocks data 34886 1727204502.42941: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 34886 1727204502.42942: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 34886 1727204502.42943: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 34886 1727204502.43066: in VariableManager get_vars() 34886 1727204502.43085: done with get_vars() 34886 1727204502.43179: done processing included file 34886 1727204502.43180: iterating over new_blocks loaded from include file 34886 1727204502.43182: in VariableManager get_vars() 34886 1727204502.43197: done with get_vars() 34886 1727204502.43198: filtering new block on tags 34886 1727204502.43212: done filtering new block on tags 34886 1727204502.43214: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node3 34886 1727204502.43217: extending task lists for all hosts with included blocks 34886 1727204502.44837: done extending task lists 34886 1727204502.44838: done processing included files 34886 1727204502.44839: results queue empty 34886 1727204502.44840: checking for any_errors_fatal 34886 1727204502.44841: done checking for any_errors_fatal 34886 1727204502.44841: checking for max_fail_percentage 34886 1727204502.44842: done checking for max_fail_percentage 34886 1727204502.44843: checking to see if all hosts have failed and the running result is not ok 34886 1727204502.44844: done checking to see if all hosts have failed 34886 1727204502.44844: getting the remaining hosts for this loop 34886 1727204502.44845: done getting the remaining hosts for this loop 34886 1727204502.44847: getting the next task for host managed-node3 34886 1727204502.44850: done getting next task for host managed-node3 34886 1727204502.44851: ^ task is: TASK: Include the task 'get_interface_stat.yml' 34886 1727204502.44853: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204502.44855: getting variables 34886 1727204502.44856: in VariableManager get_vars() 34886 1727204502.44868: Calling all_inventory to load vars for managed-node3 34886 1727204502.44870: Calling groups_inventory to load vars for managed-node3 34886 1727204502.44873: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204502.44878: Calling all_plugins_play to load vars for managed-node3 34886 1727204502.44880: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204502.44882: Calling groups_plugins_play to load vars for managed-node3 34886 1727204502.45995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204502.47577: done with get_vars() 34886 1727204502.47599: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:01:42 -0400 (0:00:00.124) 0:00:20.644 ***** 34886 1727204502.47661: entering _queue_task() for managed-node3/include_tasks 34886 1727204502.47948: worker is 1 (out of 1 available) 34886 1727204502.47964: exiting _queue_task() for managed-node3/include_tasks 34886 1727204502.47978: done queuing things up, now waiting for results queue to drain 34886 1727204502.47980: waiting for pending results... 34886 1727204502.48171: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 34886 1727204502.48245: in run() - task 12b410aa-8751-04b9-2e74-0000000002b5 34886 1727204502.48257: variable 'ansible_search_path' from source: unknown 34886 1727204502.48261: variable 'ansible_search_path' from source: unknown 34886 1727204502.48293: calling self._execute() 34886 1727204502.48378: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204502.48385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204502.48397: variable 'omit' from source: magic vars 34886 1727204502.48723: variable 'ansible_distribution_major_version' from source: facts 34886 1727204502.48732: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204502.48739: _execute() done 34886 1727204502.48742: dumping result to json 34886 1727204502.48748: done dumping result, returning 34886 1727204502.48755: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-04b9-2e74-0000000002b5] 34886 1727204502.48762: sending task result for task 12b410aa-8751-04b9-2e74-0000000002b5 34886 1727204502.48856: done sending task result for task 12b410aa-8751-04b9-2e74-0000000002b5 34886 1727204502.48861: WORKER PROCESS EXITING 34886 1727204502.48894: no more pending results, returning what we have 34886 1727204502.48899: in VariableManager get_vars() 34886 1727204502.48952: Calling all_inventory to load vars for managed-node3 34886 1727204502.48956: Calling groups_inventory to load vars for managed-node3 34886 1727204502.48959: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204502.48972: Calling all_plugins_play to load vars for managed-node3 34886 1727204502.48975: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204502.48979: Calling groups_plugins_play to load vars for managed-node3 34886 1727204502.50202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 34886 1727204502.51775: done with get_vars() 34886 1727204502.51798: variable 'ansible_search_path' from source: unknown 34886 1727204502.51799: variable 'ansible_search_path' from source: unknown 34886 1727204502.51834: we have included files to process 34886 1727204502.51836: generating all_blocks data 34886 1727204502.51837: done generating all_blocks data 34886 1727204502.51838: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 34886 1727204502.51839: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 34886 1727204502.51841: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 34886 1727204502.52034: done processing included file 34886 1727204502.52036: iterating over new_blocks loaded from include file 34886 1727204502.52038: in VariableManager get_vars() 34886 1727204502.52054: done with get_vars() 34886 1727204502.52056: filtering new block on tags 34886 1727204502.52068: done filtering new block on tags 34886 1727204502.52070: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 34886 1727204502.52074: extending task lists for all hosts with included blocks 34886 1727204502.52160: done extending task lists 34886 1727204502.52161: done processing included files 34886 1727204502.52161: results queue empty 34886 1727204502.52162: checking for any_errors_fatal 34886 1727204502.52165: done checking for any_errors_fatal 34886 1727204502.52166: checking for max_fail_percentage 34886 1727204502.52167: done checking for max_fail_percentage 34886 1727204502.52167: checking to see if all hosts have failed and the running result is not ok 34886 1727204502.52168: done checking to see if all hosts have failed 34886 1727204502.52168: getting the remaining hosts for this loop 34886 1727204502.52169: done getting the remaining hosts for this loop 34886 1727204502.52172: getting the next task for host managed-node3 34886 1727204502.52174: done getting next task for host managed-node3 34886 1727204502.52176: ^ task is: TASK: Get stat for interface {{ interface }} 34886 1727204502.52178: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204502.52180: getting variables 34886 1727204502.52181: in VariableManager get_vars() 34886 1727204502.52193: Calling all_inventory to load vars for managed-node3 34886 1727204502.52195: Calling groups_inventory to load vars for managed-node3 34886 1727204502.52197: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204502.52201: Calling all_plugins_play to load vars for managed-node3 34886 1727204502.52203: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204502.52205: Calling groups_plugins_play to load vars for managed-node3 34886 1727204502.53379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204502.54943: done with get_vars() 34886 1727204502.54968: done getting variables 34886 1727204502.55109: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:01:42 -0400 (0:00:00.074) 0:00:20.719 ***** 34886 1727204502.55137: entering _queue_task() for managed-node3/stat 34886 1727204502.55422: worker is 1 (out of 1 available) 34886 1727204502.55438: exiting _queue_task() for managed-node3/stat 34886 1727204502.55453: done queuing things up, now waiting for results queue to drain 34886 1727204502.55455: waiting for pending results... 34886 1727204502.55646: running TaskExecutor() for managed-node3/TASK: Get stat for interface veth0 34886 1727204502.55796: in run() - task 12b410aa-8751-04b9-2e74-0000000003a0 34886 1727204502.55801: variable 'ansible_search_path' from source: unknown 34886 1727204502.55806: variable 'ansible_search_path' from source: unknown 34886 1727204502.55809: calling self._execute() 34886 1727204502.55862: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204502.55867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204502.55879: variable 'omit' from source: magic vars 34886 1727204502.56215: variable 'ansible_distribution_major_version' from source: facts 34886 1727204502.56231: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204502.56238: variable 'omit' from source: magic vars 34886 1727204502.56277: variable 'omit' from source: magic vars 34886 1727204502.56362: variable 'interface' from source: play vars 34886 1727204502.56378: variable 'omit' from source: magic vars 34886 1727204502.56417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204502.56452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204502.56472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204502.56488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204502.56501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204502.56531: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204502.56535: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204502.56538: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 34886 1727204502.56629: Set connection var ansible_timeout to 10 34886 1727204502.56635: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204502.56638: Set connection var ansible_connection to ssh 34886 1727204502.56645: Set connection var ansible_shell_executable to /bin/sh 34886 1727204502.56655: Set connection var ansible_pipelining to False 34886 1727204502.56658: Set connection var ansible_shell_type to sh 34886 1727204502.56682: variable 'ansible_shell_executable' from source: unknown 34886 1727204502.56686: variable 'ansible_connection' from source: unknown 34886 1727204502.56689: variable 'ansible_module_compression' from source: unknown 34886 1727204502.56693: variable 'ansible_shell_type' from source: unknown 34886 1727204502.56696: variable 'ansible_shell_executable' from source: unknown 34886 1727204502.56703: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204502.56705: variable 'ansible_pipelining' from source: unknown 34886 1727204502.56710: variable 'ansible_timeout' from source: unknown 34886 1727204502.56715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204502.56900: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34886 1727204502.56910: variable 'omit' from source: magic vars 34886 1727204502.56919: starting attempt loop 34886 1727204502.56922: running the handler 34886 1727204502.56939: _low_level_execute_command(): starting 34886 1727204502.56946: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204502.57472: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204502.57507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204502.57511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.57514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204502.57516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.57566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204502.57583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204502.57624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204502.59369: stdout chunk (state=3): >>>/root <<< 34886 1727204502.59503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204502.59542: stderr chunk (state=3): >>><<< 34886 
1727204502.59545: stdout chunk (state=3): >>><<< 34886 1727204502.59565: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204502.59595: _low_level_execute_command(): starting 34886 1727204502.59600: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993 `" && echo ansible-tmp-1727204502.5957088-36077-115634890716993="` echo /root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993 `" ) && sleep 0' 34886 1727204502.60042: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204502.60046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.60050: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204502.60060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.60117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204502.60121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204502.60153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204502.62120: stdout chunk (state=3): >>>ansible-tmp-1727204502.5957088-36077-115634890716993=/root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993 <<< 34886 1727204502.62239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204502.62284: stderr chunk (state=3): >>><<< 34886 1727204502.62287: stdout 
chunk (state=3): >>><<< 34886 1727204502.62307: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204502.5957088-36077-115634890716993=/root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204502.62352: variable 'ansible_module_compression' from source: unknown 34886 1727204502.62398: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 34886 1727204502.62432: variable 'ansible_facts' from source: unknown 34886 1727204502.62485: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993/AnsiballZ_stat.py 34886 1727204502.62595: Sending initial data 34886 1727204502.62599: Sent initial data (153 bytes) 34886 1727204502.63063: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204502.63066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.63069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204502.63072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.63127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204502.63131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204502.63173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204502.64781: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204502.64814: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204502.64848: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp2utzwsk_ /root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993/AnsiballZ_stat.py <<< 34886 1727204502.64852: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993/AnsiballZ_stat.py" <<< 34886 1727204502.64879: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp2utzwsk_" to remote "/root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993/AnsiballZ_stat.py" <<< 34886 1727204502.65638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204502.65710: stderr chunk (state=3): >>><<< 34886 1727204502.65714: stdout chunk (state=3): >>><<< 34886 1727204502.65736: done transferring module to remote 34886 1727204502.65747: _low_level_execute_command(): starting 34886 1727204502.65752: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993/ /root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993/AnsiballZ_stat.py && sleep 0' 34886 1727204502.66233: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204502.66237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204502.66239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.66242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204502.66245: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.66294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204502.66298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 34886 1727204502.66342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204502.68159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204502.68214: stderr chunk (state=3): >>><<< 34886 1727204502.68217: stdout chunk (state=3): >>><<< 34886 1727204502.68233: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204502.68236: _low_level_execute_command(): starting 34886 1727204502.68242: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993/AnsiballZ_stat.py && sleep 0' 34886 1727204502.68696: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204502.68700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204502.68703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204502.68705: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204502.68707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.68765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204502.68769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204502.68812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204502.86050: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, 
"isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38029, "dev": 23, "nlink": 1, "atime": 1727204490.8496943, "mtime": 1727204490.8496943, "ctime": 1727204490.8496943, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 34886 1727204502.87505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204502.87571: stderr chunk (state=3): >>><<< 34886 1727204502.87574: stdout chunk (state=3): >>><<< 34886 1727204502.87593: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38029, "dev": 23, "nlink": 1, "atime": 1727204490.8496943, "mtime": 1727204490.8496943, "ctime": 1727204490.8496943, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
34886 1727204502.87649: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204502.87659: _low_level_execute_command(): starting 34886 1727204502.87665: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204502.5957088-36077-115634890716993/ > /dev/null 2>&1 && sleep 0' 34886 1727204502.88159: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204502.88163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204502.88173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.88175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204502.88178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204502.88227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204502.88231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204502.88278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204502.90180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204502.90231: stderr chunk (state=3): >>><<< 34886 1727204502.90235: stdout chunk (state=3): >>><<< 34886 1727204502.90249: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204502.90256: handler run complete 34886 1727204502.90300: attempt loop complete, returning result 34886 1727204502.90307: _execute() done 34886 1727204502.90310: dumping result to json 34886 1727204502.90315: done dumping result, returning 34886 1727204502.90327: done running TaskExecutor() for managed-node3/TASK: Get stat for interface veth0 [12b410aa-8751-04b9-2e74-0000000003a0] 34886 1727204502.90338: sending task result for task 12b410aa-8751-04b9-2e74-0000000003a0 34886 1727204502.90448: done sending task result for task 12b410aa-8751-04b9-2e74-0000000003a0 34886 1727204502.90452: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204490.8496943, "block_size": 4096, "blocks": 0, "ctime": 1727204490.8496943, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 38029, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1727204490.8496943, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 34886 1727204502.90564: no more pending results, returning what we have 34886 1727204502.90567: results queue empty 34886 1727204502.90569: checking for any_errors_fatal 34886 1727204502.90571: done checking for any_errors_fatal 34886 1727204502.90571: checking for max_fail_percentage 34886 1727204502.90573: done checking for max_fail_percentage 34886 1727204502.90574: checking to see if all hosts have failed and the running result is not ok 34886 1727204502.90575: done checking to see if all hosts have failed 34886 1727204502.90576: getting the remaining hosts for this loop 34886 1727204502.90578: done getting the remaining hosts for this loop 34886 1727204502.90582: getting the next task for host managed-node3 34886 1727204502.90592: done getting next task for host managed-node3 34886 1727204502.90596: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 34886 1727204502.90598: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204502.90603: getting variables 34886 1727204502.90605: in VariableManager get_vars() 34886 1727204502.90644: Calling all_inventory to load vars for managed-node3 34886 1727204502.90648: Calling groups_inventory to load vars for managed-node3 34886 1727204502.90650: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204502.90661: Calling all_plugins_play to load vars for managed-node3 34886 1727204502.90665: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204502.90668: Calling groups_plugins_play to load vars for managed-node3 34886 1727204502.92030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204502.93609: done with get_vars() 34886 1727204502.93637: done getting variables 34886 1727204502.93723: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 34886 1727204502.93824: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:01:42 -0400 (0:00:00.387) 0:00:21.106 ***** 34886 1727204502.93852: entering _queue_task() for managed-node3/assert 34886 1727204502.93854: Creating lock for assert 34886 1727204502.94116: worker is 1 (out of 1 available) 34886 1727204502.94135: exiting _queue_task() for managed-node3/assert 34886 1727204502.94148: done queuing things up, now waiting for results queue to drain 34886 1727204502.94150: waiting for pending results... 
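Before the assert runs, note how the two tasks traced from assert_device_present.yml fit together: line 3 includes get_interface_stat.yml and line 5 asserts on its result. A rough sketch of that file, inferred from the task names, task paths, and the conditional interface_stat.stat.exists evaluated just below (assumed shape, not the file's exact contents):

    # assert_device_present.yml (sketch, assumed)
    - name: Include the task 'get_interface_stat.yml'
      include_tasks: get_interface_stat.yml

    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists
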
34886 1727204502.94342: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'veth0' 34886 1727204502.94426: in run() - task 12b410aa-8751-04b9-2e74-0000000002b6 34886 1727204502.94439: variable 'ansible_search_path' from source: unknown 34886 1727204502.94442: variable 'ansible_search_path' from source: unknown 34886 1727204502.94474: calling self._execute() 34886 1727204502.94560: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204502.94566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204502.94578: variable 'omit' from source: magic vars 34886 1727204502.94886: variable 'ansible_distribution_major_version' from source: facts 34886 1727204502.94899: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204502.94906: variable 'omit' from source: magic vars 34886 1727204502.94941: variable 'omit' from source: magic vars 34886 1727204502.95024: variable 'interface' from source: play vars 34886 1727204502.95040: variable 'omit' from source: magic vars 34886 1727204502.95076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204502.95112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204502.95133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204502.95152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204502.95164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204502.95196: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204502.95200: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204502.95203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204502.95292: Set connection var ansible_timeout to 10 34886 1727204502.95298: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204502.95302: Set connection var ansible_connection to ssh 34886 1727204502.95360: Set connection var ansible_shell_executable to /bin/sh 34886 1727204502.95363: Set connection var ansible_pipelining to False 34886 1727204502.95366: Set connection var ansible_shell_type to sh 34886 1727204502.95368: variable 'ansible_shell_executable' from source: unknown 34886 1727204502.95373: variable 'ansible_connection' from source: unknown 34886 1727204502.95375: variable 'ansible_module_compression' from source: unknown 34886 1727204502.95377: variable 'ansible_shell_type' from source: unknown 34886 1727204502.95380: variable 'ansible_shell_executable' from source: unknown 34886 1727204502.95382: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204502.95384: variable 'ansible_pipelining' from source: unknown 34886 1727204502.95386: variable 'ansible_timeout' from source: unknown 34886 1727204502.95394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204502.95482: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 34886 1727204502.95495: variable 'omit' from source: magic vars 34886 1727204502.95506: starting attempt loop 34886 1727204502.95509: running the handler 34886 1727204502.95618: variable 'interface_stat' from source: set_fact 34886 1727204502.95639: Evaluated conditional (interface_stat.stat.exists): True 34886 1727204502.95645: handler run complete 34886 1727204502.95659: attempt loop complete, returning result 34886 1727204502.95662: _execute() done 34886 1727204502.95665: dumping result to json 34886 1727204502.95670: done dumping result, returning 34886 1727204502.95677: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'veth0' [12b410aa-8751-04b9-2e74-0000000002b6] 34886 1727204502.95684: sending task result for task 12b410aa-8751-04b9-2e74-0000000002b6 34886 1727204502.95773: done sending task result for task 12b410aa-8751-04b9-2e74-0000000002b6 34886 1727204502.95776: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 34886 1727204502.95841: no more pending results, returning what we have 34886 1727204502.95845: results queue empty 34886 1727204502.95846: checking for any_errors_fatal 34886 1727204502.95855: done checking for any_errors_fatal 34886 1727204502.95856: checking for max_fail_percentage 34886 1727204502.95858: done checking for max_fail_percentage 34886 1727204502.95859: checking to see if all hosts have failed and the running result is not ok 34886 1727204502.95860: done checking to see if all hosts have failed 34886 1727204502.95861: getting the remaining hosts for this loop 34886 1727204502.95862: done getting the remaining hosts for this loop 34886 1727204502.95867: getting the next task for host managed-node3 34886 1727204502.95874: done getting next task for host managed-node3 34886 1727204502.95877: ^ task is: TASK: Include the task 'assert_profile_present.yml' 34886 1727204502.95879: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204502.95883: getting variables 34886 1727204502.95884: in VariableManager get_vars() 34886 1727204502.95927: Calling all_inventory to load vars for managed-node3 34886 1727204502.95930: Calling groups_inventory to load vars for managed-node3 34886 1727204502.95933: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204502.95944: Calling all_plugins_play to load vars for managed-node3 34886 1727204502.95947: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204502.95952: Calling groups_plugins_play to load vars for managed-node3 34886 1727204502.97149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204502.98705: done with get_vars() 34886 1727204502.98729: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:49 Tuesday 24 September 2024 15:01:42 -0400 (0:00:00.049) 0:00:21.155 ***** 34886 1727204502.98801: entering _queue_task() for managed-node3/include_tasks 34886 1727204502.99019: worker is 1 (out of 1 available) 34886 1727204502.99033: exiting _queue_task() for managed-node3/include_tasks 34886 1727204502.99046: done queuing things up, now waiting for results queue to drain 34886 1727204502.99048: waiting for pending results... 34886 1727204502.99229: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' 34886 1727204502.99297: in run() - task 12b410aa-8751-04b9-2e74-00000000005d 34886 1727204502.99309: variable 'ansible_search_path' from source: unknown 34886 1727204502.99343: calling self._execute() 34886 1727204502.99419: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204502.99428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204502.99439: variable 'omit' from source: magic vars 34886 1727204502.99745: variable 'ansible_distribution_major_version' from source: facts 34886 1727204502.99756: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204502.99763: _execute() done 34886 1727204502.99767: dumping result to json 34886 1727204502.99771: done dumping result, returning 34886 1727204502.99778: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' [12b410aa-8751-04b9-2e74-00000000005d] 34886 1727204502.99784: sending task result for task 12b410aa-8751-04b9-2e74-00000000005d 34886 1727204502.99877: done sending task result for task 12b410aa-8751-04b9-2e74-00000000005d 34886 1727204502.99880: WORKER PROCESS EXITING 34886 1727204502.99914: no more pending results, returning what we have 34886 1727204502.99919: in VariableManager get_vars() 34886 1727204502.99965: Calling all_inventory to load vars for managed-node3 34886 1727204502.99968: Calling groups_inventory to load vars for managed-node3 34886 1727204502.99971: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204502.99981: Calling all_plugins_play to load vars for managed-node3 34886 1727204502.99984: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204502.99988: Calling groups_plugins_play to load vars for managed-node3 34886 1727204503.01284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204503.02843: done with get_vars() 34886 
1727204503.02863: variable 'ansible_search_path' from source: unknown 34886 1727204503.02875: we have included files to process 34886 1727204503.02876: generating all_blocks data 34886 1727204503.02877: done generating all_blocks data 34886 1727204503.02881: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 34886 1727204503.02881: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 34886 1727204503.02883: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 34886 1727204503.03045: in VariableManager get_vars() 34886 1727204503.03065: done with get_vars() 34886 1727204503.03284: done processing included file 34886 1727204503.03286: iterating over new_blocks loaded from include file 34886 1727204503.03288: in VariableManager get_vars() 34886 1727204503.03310: done with get_vars() 34886 1727204503.03312: filtering new block on tags 34886 1727204503.03335: done filtering new block on tags 34886 1727204503.03338: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 34886 1727204503.03344: extending task lists for all hosts with included blocks 34886 1727204503.06272: done extending task lists 34886 1727204503.06274: done processing included files 34886 1727204503.06275: results queue empty 34886 1727204503.06276: checking for any_errors_fatal 34886 1727204503.06281: done checking for any_errors_fatal 34886 1727204503.06282: checking for max_fail_percentage 34886 1727204503.06284: done checking for max_fail_percentage 34886 1727204503.06285: checking to see if all hosts have failed and the running result is not ok 34886 1727204503.06286: done checking to see if all hosts have failed 34886 1727204503.06286: getting the remaining hosts for this loop 34886 1727204503.06288: done getting the remaining hosts for this loop 34886 1727204503.06293: getting the next task for host managed-node3 34886 1727204503.06297: done getting next task for host managed-node3 34886 1727204503.06300: ^ task is: TASK: Include the task 'get_profile_stat.yml' 34886 1727204503.06303: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204503.06306: getting variables 34886 1727204503.06307: in VariableManager get_vars() 34886 1727204503.06327: Calling all_inventory to load vars for managed-node3 34886 1727204503.06330: Calling groups_inventory to load vars for managed-node3 34886 1727204503.06333: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204503.06341: Calling all_plugins_play to load vars for managed-node3 34886 1727204503.06345: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204503.06349: Calling groups_plugins_play to load vars for managed-node3 34886 1727204503.08274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204503.09843: done with get_vars() 34886 1727204503.09865: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 15:01:43 -0400 (0:00:00.111) 0:00:21.267 ***** 34886 1727204503.09940: entering _queue_task() for managed-node3/include_tasks 34886 1727204503.10208: worker is 1 (out of 1 available) 34886 1727204503.10223: exiting _queue_task() for managed-node3/include_tasks 34886 1727204503.10236: done queuing things up, now waiting for results queue to drain 34886 1727204503.10238: waiting for pending results... 34886 1727204503.10431: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 34886 1727204503.10512: in run() - task 12b410aa-8751-04b9-2e74-0000000003b8 34886 1727204503.10527: variable 'ansible_search_path' from source: unknown 34886 1727204503.10531: variable 'ansible_search_path' from source: unknown 34886 1727204503.10562: calling self._execute() 34886 1727204503.10643: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204503.10650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204503.10660: variable 'omit' from source: magic vars 34886 1727204503.10978: variable 'ansible_distribution_major_version' from source: facts 34886 1727204503.10991: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204503.10998: _execute() done 34886 1727204503.11004: dumping result to json 34886 1727204503.11007: done dumping result, returning 34886 1727204503.11023: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-04b9-2e74-0000000003b8] 34886 1727204503.11026: sending task result for task 12b410aa-8751-04b9-2e74-0000000003b8 34886 1727204503.11112: done sending task result for task 12b410aa-8751-04b9-2e74-0000000003b8 34886 1727204503.11116: WORKER PROCESS EXITING 34886 1727204503.11148: no more pending results, returning what we have 34886 1727204503.11155: in VariableManager get_vars() 34886 1727204503.11206: Calling all_inventory to load vars for managed-node3 34886 1727204503.11210: Calling groups_inventory to load vars for managed-node3 34886 1727204503.11212: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204503.11228: Calling all_plugins_play to load vars for managed-node3 34886 1727204503.11232: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204503.11236: Calling groups_plugins_play to load vars for managed-node3 34886 1727204503.12462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 34886 1727204503.14096: done with get_vars() 34886 1727204503.14115: variable 'ansible_search_path' from source: unknown 34886 1727204503.14117: variable 'ansible_search_path' from source: unknown 34886 1727204503.14147: we have included files to process 34886 1727204503.14148: generating all_blocks data 34886 1727204503.14149: done generating all_blocks data 34886 1727204503.14150: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 34886 1727204503.14151: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 34886 1727204503.14152: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 34886 1727204503.14981: done processing included file 34886 1727204503.14983: iterating over new_blocks loaded from include file 34886 1727204503.14984: in VariableManager get_vars() 34886 1727204503.15003: done with get_vars() 34886 1727204503.15004: filtering new block on tags 34886 1727204503.15024: done filtering new block on tags 34886 1727204503.15027: in VariableManager get_vars() 34886 1727204503.15040: done with get_vars() 34886 1727204503.15041: filtering new block on tags 34886 1727204503.15058: done filtering new block on tags 34886 1727204503.15060: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 34886 1727204503.15063: extending task lists for all hosts with included blocks 34886 1727204503.15194: done extending task lists 34886 1727204503.15196: done processing included files 34886 1727204503.15196: results queue empty 34886 1727204503.15197: checking for any_errors_fatal 34886 1727204503.15199: done checking for any_errors_fatal 34886 1727204503.15200: checking for max_fail_percentage 34886 1727204503.15200: done checking for max_fail_percentage 34886 1727204503.15201: checking to see if all hosts have failed and the running result is not ok 34886 1727204503.15202: done checking to see if all hosts have failed 34886 1727204503.15202: getting the remaining hosts for this loop 34886 1727204503.15203: done getting the remaining hosts for this loop 34886 1727204503.15205: getting the next task for host managed-node3 34886 1727204503.15208: done getting next task for host managed-node3 34886 1727204503.15210: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 34886 1727204503.15214: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204503.15216: getting variables 34886 1727204503.15217: in VariableManager get_vars() 34886 1727204503.15270: Calling all_inventory to load vars for managed-node3 34886 1727204503.15273: Calling groups_inventory to load vars for managed-node3 34886 1727204503.15274: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204503.15279: Calling all_plugins_play to load vars for managed-node3 34886 1727204503.15280: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204503.15283: Calling groups_plugins_play to load vars for managed-node3 34886 1727204503.16315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204503.17856: done with get_vars() 34886 1727204503.17875: done getting variables 34886 1727204503.17910: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:01:43 -0400 (0:00:00.079) 0:00:21.347 ***** 34886 1727204503.17936: entering _queue_task() for managed-node3/set_fact 34886 1727204503.18196: worker is 1 (out of 1 available) 34886 1727204503.18209: exiting _queue_task() for managed-node3/set_fact 34886 1727204503.18223: done queuing things up, now waiting for results queue to drain 34886 1727204503.18225: waiting for pending results... 
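The set_fact task queued here (get_profile_stat.yml:3) only seeds flags for the profile checks that follow; judging from the ansible_facts it returns a little further down, it is roughly equivalent to this sketch (assumed, not the file's exact text):

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false            # flipped later if the profile file / NM connection is found
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false
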
34886 1727204503.18425: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 34886 1727204503.18514: in run() - task 12b410aa-8751-04b9-2e74-0000000004b0 34886 1727204503.18531: variable 'ansible_search_path' from source: unknown 34886 1727204503.18536: variable 'ansible_search_path' from source: unknown 34886 1727204503.18571: calling self._execute() 34886 1727204503.18653: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204503.18660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204503.18673: variable 'omit' from source: magic vars 34886 1727204503.18992: variable 'ansible_distribution_major_version' from source: facts 34886 1727204503.19006: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204503.19013: variable 'omit' from source: magic vars 34886 1727204503.19053: variable 'omit' from source: magic vars 34886 1727204503.19083: variable 'omit' from source: magic vars 34886 1727204503.19122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204503.19157: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204503.19175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204503.19194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204503.19205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204503.19239: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204503.19242: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204503.19246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204503.19334: Set connection var ansible_timeout to 10 34886 1727204503.19342: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204503.19345: Set connection var ansible_connection to ssh 34886 1727204503.19351: Set connection var ansible_shell_executable to /bin/sh 34886 1727204503.19359: Set connection var ansible_pipelining to False 34886 1727204503.19363: Set connection var ansible_shell_type to sh 34886 1727204503.19384: variable 'ansible_shell_executable' from source: unknown 34886 1727204503.19388: variable 'ansible_connection' from source: unknown 34886 1727204503.19393: variable 'ansible_module_compression' from source: unknown 34886 1727204503.19397: variable 'ansible_shell_type' from source: unknown 34886 1727204503.19400: variable 'ansible_shell_executable' from source: unknown 34886 1727204503.19405: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204503.19410: variable 'ansible_pipelining' from source: unknown 34886 1727204503.19412: variable 'ansible_timeout' from source: unknown 34886 1727204503.19419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204503.19541: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204503.19553: variable 
'omit' from source: magic vars 34886 1727204503.19557: starting attempt loop 34886 1727204503.19563: running the handler 34886 1727204503.19577: handler run complete 34886 1727204503.19586: attempt loop complete, returning result 34886 1727204503.19590: _execute() done 34886 1727204503.19593: dumping result to json 34886 1727204503.19598: done dumping result, returning 34886 1727204503.19605: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-04b9-2e74-0000000004b0] 34886 1727204503.19611: sending task result for task 12b410aa-8751-04b9-2e74-0000000004b0 34886 1727204503.19701: done sending task result for task 12b410aa-8751-04b9-2e74-0000000004b0 34886 1727204503.19704: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 34886 1727204503.19765: no more pending results, returning what we have 34886 1727204503.19769: results queue empty 34886 1727204503.19771: checking for any_errors_fatal 34886 1727204503.19773: done checking for any_errors_fatal 34886 1727204503.19774: checking for max_fail_percentage 34886 1727204503.19776: done checking for max_fail_percentage 34886 1727204503.19777: checking to see if all hosts have failed and the running result is not ok 34886 1727204503.19778: done checking to see if all hosts have failed 34886 1727204503.19779: getting the remaining hosts for this loop 34886 1727204503.19780: done getting the remaining hosts for this loop 34886 1727204503.19784: getting the next task for host managed-node3 34886 1727204503.19793: done getting next task for host managed-node3 34886 1727204503.19796: ^ task is: TASK: Stat profile file 34886 1727204503.19800: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204503.19804: getting variables 34886 1727204503.19806: in VariableManager get_vars() 34886 1727204503.19846: Calling all_inventory to load vars for managed-node3 34886 1727204503.19849: Calling groups_inventory to load vars for managed-node3 34886 1727204503.19851: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204503.19862: Calling all_plugins_play to load vars for managed-node3 34886 1727204503.19865: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204503.19868: Calling groups_plugins_play to load vars for managed-node3 34886 1727204503.21146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204503.22735: done with get_vars() 34886 1727204503.22763: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:01:43 -0400 (0:00:00.049) 0:00:21.396 ***** 34886 1727204503.22849: entering _queue_task() for managed-node3/stat 34886 1727204503.23130: worker is 1 (out of 1 available) 34886 1727204503.23146: exiting _queue_task() for managed-node3/stat 34886 1727204503.23162: done queuing things up, now waiting for results queue to drain 34886 1727204503.23164: waiting for pending results... 34886 1727204503.23360: running TaskExecutor() for managed-node3/TASK: Stat profile file 34886 1727204503.23449: in run() - task 12b410aa-8751-04b9-2e74-0000000004b1 34886 1727204503.23462: variable 'ansible_search_path' from source: unknown 34886 1727204503.23465: variable 'ansible_search_path' from source: unknown 34886 1727204503.23502: calling self._execute() 34886 1727204503.23583: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204503.23591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204503.23602: variable 'omit' from source: magic vars 34886 1727204503.23925: variable 'ansible_distribution_major_version' from source: facts 34886 1727204503.23934: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204503.23943: variable 'omit' from source: magic vars 34886 1727204503.23983: variable 'omit' from source: magic vars 34886 1727204503.24072: variable 'profile' from source: include params 34886 1727204503.24076: variable 'interface' from source: play vars 34886 1727204503.24136: variable 'interface' from source: play vars 34886 1727204503.24153: variable 'omit' from source: magic vars 34886 1727204503.24194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204503.24227: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204503.24246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204503.24264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204503.24276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204503.24310: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204503.24314: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204503.24316: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204503.24407: Set connection var ansible_timeout to 10 34886 1727204503.24413: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204503.24417: Set connection var ansible_connection to ssh 34886 1727204503.24425: Set connection var ansible_shell_executable to /bin/sh 34886 1727204503.24433: Set connection var ansible_pipelining to False 34886 1727204503.24436: Set connection var ansible_shell_type to sh 34886 1727204503.24459: variable 'ansible_shell_executable' from source: unknown 34886 1727204503.24462: variable 'ansible_connection' from source: unknown 34886 1727204503.24465: variable 'ansible_module_compression' from source: unknown 34886 1727204503.24470: variable 'ansible_shell_type' from source: unknown 34886 1727204503.24473: variable 'ansible_shell_executable' from source: unknown 34886 1727204503.24477: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204503.24482: variable 'ansible_pipelining' from source: unknown 34886 1727204503.24486: variable 'ansible_timeout' from source: unknown 34886 1727204503.24492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204503.24671: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34886 1727204503.24681: variable 'omit' from source: magic vars 34886 1727204503.24687: starting attempt loop 34886 1727204503.24692: running the handler 34886 1727204503.24708: _low_level_execute_command(): starting 34886 1727204503.24725: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204503.25274: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204503.25278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.25282: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204503.25286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.25339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204503.25342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204503.25397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204503.27152: stdout chunk (state=3): >>>/root <<< 34886 1727204503.27261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204503.27318: stderr chunk (state=3): >>><<< 34886 1727204503.27322: 
stdout chunk (state=3): >>><<< 34886 1727204503.27350: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204503.27361: _low_level_execute_command(): starting 34886 1727204503.27368: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912 `" && echo ansible-tmp-1727204503.273493-36096-148482076025912="` echo /root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912 `" ) && sleep 0' 34886 1727204503.27839: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204503.27842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204503.27845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204503.27855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.27908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204503.27916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204503.27953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204503.29939: stdout chunk (state=3): >>>ansible-tmp-1727204503.273493-36096-148482076025912=/root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912 <<< 34886 1727204503.30055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204503.30105: stderr chunk (state=3): >>><<< 34886 1727204503.30108: stdout chunk (state=3): >>><<< 
34886 1727204503.30128: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204503.273493-36096-148482076025912=/root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204503.30169: variable 'ansible_module_compression' from source: unknown 34886 1727204503.30219: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 34886 1727204503.30256: variable 'ansible_facts' from source: unknown 34886 1727204503.30312: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912/AnsiballZ_stat.py 34886 1727204503.30431: Sending initial data 34886 1727204503.30435: Sent initial data (152 bytes) 34886 1727204503.30894: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204503.30898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204503.30900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204503.30906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.30953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204503.30969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204503.31006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204503.32612: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 34886 1727204503.32617: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204503.32644: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204503.32680: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpaylvjawv /root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912/AnsiballZ_stat.py <<< 34886 1727204503.32684: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912/AnsiballZ_stat.py" <<< 34886 1727204503.32712: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpaylvjawv" to remote "/root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912/AnsiballZ_stat.py" <<< 34886 1727204503.33478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204503.33541: stderr chunk (state=3): >>><<< 34886 1727204503.33544: stdout chunk (state=3): >>><<< 34886 1727204503.33564: done transferring module to remote 34886 1727204503.33574: _low_level_execute_command(): starting 34886 1727204503.33579: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912/ /root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912/AnsiballZ_stat.py && sleep 0' 34886 1727204503.34032: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204503.34036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204503.34041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204503.34044: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204503.34046: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.34102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204503.34105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 34886 1727204503.34138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204503.35942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204503.35986: stderr chunk (state=3): >>><<< 34886 1727204503.35991: stdout chunk (state=3): >>><<< 34886 1727204503.36006: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204503.36009: _low_level_execute_command(): starting 34886 1727204503.36015: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912/AnsiballZ_stat.py && sleep 0' 34886 1727204503.36458: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204503.36461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204503.36464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204503.36469: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204503.36471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.36520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204503.36523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204503.36571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204503.54139: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, 
"get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 34886 1727204503.55659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204503.55720: stderr chunk (state=3): >>><<< 34886 1727204503.55724: stdout chunk (state=3): >>><<< 34886 1727204503.55741: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
34886 1727204503.55776: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204503.55785: _low_level_execute_command(): starting 34886 1727204503.55792: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204503.273493-36096-148482076025912/ > /dev/null 2>&1 && sleep 0' 34886 1727204503.56257: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204503.56303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204503.56306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204503.56309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204503.56311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204503.56313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.56358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204503.56362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204503.56413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204503.58347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204503.58401: stderr chunk (state=3): >>><<< 34886 1727204503.58405: stdout chunk (state=3): >>><<< 34886 1727204503.58422: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204503.58428: handler run complete 34886 1727204503.58451: attempt loop complete, returning result 34886 1727204503.58454: _execute() done 34886 1727204503.58457: dumping result to json 34886 1727204503.58462: done dumping result, returning 34886 1727204503.58472: done running TaskExecutor() for managed-node3/TASK: Stat profile file [12b410aa-8751-04b9-2e74-0000000004b1] 34886 1727204503.58483: sending task result for task 12b410aa-8751-04b9-2e74-0000000004b1 34886 1727204503.58591: done sending task result for task 12b410aa-8751-04b9-2e74-0000000004b1 34886 1727204503.58595: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 34886 1727204503.58664: no more pending results, returning what we have 34886 1727204503.58669: results queue empty 34886 1727204503.58670: checking for any_errors_fatal 34886 1727204503.58678: done checking for any_errors_fatal 34886 1727204503.58679: checking for max_fail_percentage 34886 1727204503.58681: done checking for max_fail_percentage 34886 1727204503.58682: checking to see if all hosts have failed and the running result is not ok 34886 1727204503.58683: done checking to see if all hosts have failed 34886 1727204503.58684: getting the remaining hosts for this loop 34886 1727204503.58685: done getting the remaining hosts for this loop 34886 1727204503.58698: getting the next task for host managed-node3 34886 1727204503.58706: done getting next task for host managed-node3 34886 1727204503.58709: ^ task is: TASK: Set NM profile exist flag based on the profile files 34886 1727204503.58714: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204503.58718: getting variables 34886 1727204503.58722: in VariableManager get_vars() 34886 1727204503.58765: Calling all_inventory to load vars for managed-node3 34886 1727204503.58769: Calling groups_inventory to load vars for managed-node3 34886 1727204503.58772: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204503.58783: Calling all_plugins_play to load vars for managed-node3 34886 1727204503.58786: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204503.58792: Calling groups_plugins_play to load vars for managed-node3 34886 1727204503.60136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204503.61708: done with get_vars() 34886 1727204503.61732: done getting variables 34886 1727204503.61786: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:01:43 -0400 (0:00:00.389) 0:00:21.785 ***** 34886 1727204503.61813: entering _queue_task() for managed-node3/set_fact 34886 1727204503.62068: worker is 1 (out of 1 available) 34886 1727204503.62082: exiting _queue_task() for managed-node3/set_fact 34886 1727204503.62100: done queuing things up, now waiting for results queue to drain 34886 1727204503.62102: waiting for pending results... 
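[editor's note] The next task in the include (get_profile_stat.yml:17) is another set_fact, but a conditional one: the executor evaluates profile_stat.stat.exists, finds it False because the ifcfg file does not exist, and skips the task, as the skipping: result just below shows. A hypothetical shape for it follows; only the when: condition is taken verbatim from the log, the fact name and value are inferred:

    # hypothetical sketch of get_profile_stat.yml:17
    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true
      when: profile_stat.stat.exists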
34886 1727204503.62283: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 34886 1727204503.62371: in run() - task 12b410aa-8751-04b9-2e74-0000000004b2 34886 1727204503.62384: variable 'ansible_search_path' from source: unknown 34886 1727204503.62387: variable 'ansible_search_path' from source: unknown 34886 1727204503.62433: calling self._execute() 34886 1727204503.62508: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204503.62514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204503.62525: variable 'omit' from source: magic vars 34886 1727204503.62839: variable 'ansible_distribution_major_version' from source: facts 34886 1727204503.62851: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204503.62957: variable 'profile_stat' from source: set_fact 34886 1727204503.62969: Evaluated conditional (profile_stat.stat.exists): False 34886 1727204503.62973: when evaluation is False, skipping this task 34886 1727204503.62976: _execute() done 34886 1727204503.62979: dumping result to json 34886 1727204503.62983: done dumping result, returning 34886 1727204503.62995: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-04b9-2e74-0000000004b2] 34886 1727204503.63002: sending task result for task 12b410aa-8751-04b9-2e74-0000000004b2 34886 1727204503.63166: done sending task result for task 12b410aa-8751-04b9-2e74-0000000004b2 34886 1727204503.63169: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34886 1727204503.63241: no more pending results, returning what we have 34886 1727204503.63244: results queue empty 34886 1727204503.63245: checking for any_errors_fatal 34886 1727204503.63251: done checking for any_errors_fatal 34886 1727204503.63252: checking for max_fail_percentage 34886 1727204503.63253: done checking for max_fail_percentage 34886 1727204503.63254: checking to see if all hosts have failed and the running result is not ok 34886 1727204503.63254: done checking to see if all hosts have failed 34886 1727204503.63255: getting the remaining hosts for this loop 34886 1727204503.63256: done getting the remaining hosts for this loop 34886 1727204503.63259: getting the next task for host managed-node3 34886 1727204503.63263: done getting next task for host managed-node3 34886 1727204503.63265: ^ task is: TASK: Get NM profile info 34886 1727204503.63268: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204503.63271: getting variables 34886 1727204503.63272: in VariableManager get_vars() 34886 1727204503.63305: Calling all_inventory to load vars for managed-node3 34886 1727204503.63307: Calling groups_inventory to load vars for managed-node3 34886 1727204503.63309: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204503.63317: Calling all_plugins_play to load vars for managed-node3 34886 1727204503.63321: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204503.63324: Calling groups_plugins_play to load vars for managed-node3 34886 1727204503.64491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204503.66072: done with get_vars() 34886 1727204503.66096: done getting variables 34886 1727204503.66145: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:01:43 -0400 (0:00:00.043) 0:00:21.829 ***** 34886 1727204503.66172: entering _queue_task() for managed-node3/shell 34886 1727204503.66400: worker is 1 (out of 1 available) 34886 1727204503.66416: exiting _queue_task() for managed-node3/shell 34886 1727204503.66432: done queuing things up, now waiting for results queue to drain 34886 1727204503.66434: waiting for pending results... 34886 1727204503.66613: running TaskExecutor() for managed-node3/TASK: Get NM profile info 34886 1727204503.66697: in run() - task 12b410aa-8751-04b9-2e74-0000000004b3 34886 1727204503.66711: variable 'ansible_search_path' from source: unknown 34886 1727204503.66714: variable 'ansible_search_path' from source: unknown 34886 1727204503.66747: calling self._execute() 34886 1727204503.66829: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204503.66838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204503.66848: variable 'omit' from source: magic vars 34886 1727204503.67152: variable 'ansible_distribution_major_version' from source: facts 34886 1727204503.67162: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204503.67172: variable 'omit' from source: magic vars 34886 1727204503.67211: variable 'omit' from source: magic vars 34886 1727204503.67295: variable 'profile' from source: include params 34886 1727204503.67299: variable 'interface' from source: play vars 34886 1727204503.67361: variable 'interface' from source: play vars 34886 1727204503.67378: variable 'omit' from source: magic vars 34886 1727204503.67418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204503.67450: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204503.67468: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204503.67484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204503.67498: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204503.67526: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204503.67532: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204503.67535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204503.67623: Set connection var ansible_timeout to 10 34886 1727204503.67627: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204503.67629: Set connection var ansible_connection to ssh 34886 1727204503.67637: Set connection var ansible_shell_executable to /bin/sh 34886 1727204503.67651: Set connection var ansible_pipelining to False 34886 1727204503.67654: Set connection var ansible_shell_type to sh 34886 1727204503.67672: variable 'ansible_shell_executable' from source: unknown 34886 1727204503.67675: variable 'ansible_connection' from source: unknown 34886 1727204503.67680: variable 'ansible_module_compression' from source: unknown 34886 1727204503.67683: variable 'ansible_shell_type' from source: unknown 34886 1727204503.67687: variable 'ansible_shell_executable' from source: unknown 34886 1727204503.67692: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204503.67698: variable 'ansible_pipelining' from source: unknown 34886 1727204503.67701: variable 'ansible_timeout' from source: unknown 34886 1727204503.67707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204503.67828: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204503.67838: variable 'omit' from source: magic vars 34886 1727204503.67844: starting attempt loop 34886 1727204503.67847: running the handler 34886 1727204503.67858: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204503.67879: _low_level_execute_command(): starting 34886 1727204503.67886: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204503.68441: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204503.68445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.68449: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204503.68451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.68510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204503.68518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204503.68558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204503.70292: stdout chunk (state=3): >>>/root <<< 34886 1727204503.70402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204503.70454: stderr chunk (state=3): >>><<< 34886 1727204503.70458: stdout chunk (state=3): >>><<< 34886 1727204503.70480: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204503.70493: _low_level_execute_command(): starting 34886 1727204503.70501: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155 `" && echo ansible-tmp-1727204503.7047958-36105-136747082288155="` echo /root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155 `" ) && sleep 0' 34886 1727204503.70974: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204503.70987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.70991: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204503.70994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.71041: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204503.71047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204503.71086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204503.73044: stdout chunk (state=3): >>>ansible-tmp-1727204503.7047958-36105-136747082288155=/root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155 <<< 34886 1727204503.73204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204503.73211: stderr chunk (state=3): >>><<< 34886 1727204503.73215: stdout chunk (state=3): >>><<< 34886 1727204503.73260: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204503.7047958-36105-136747082288155=/root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204503.73263: variable 'ansible_module_compression' from source: unknown 34886 1727204503.73305: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204503.73341: variable 'ansible_facts' from source: unknown 34886 1727204503.73394: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155/AnsiballZ_command.py 34886 1727204503.73510: Sending initial data 34886 1727204503.73514: Sent initial data (156 bytes) 34886 1727204503.73980: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204503.73983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204503.73986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204503.73992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204503.73994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.74045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204503.74050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204503.74086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204503.75679: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204503.75694: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204503.75718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204503.75749: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpo6_7lpp6 /root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155/AnsiballZ_command.py <<< 34886 1727204503.75758: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155/AnsiballZ_command.py" <<< 34886 1727204503.75781: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpo6_7lpp6" to remote "/root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155/AnsiballZ_command.py" <<< 34886 1727204503.76543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204503.76605: stderr chunk (state=3): >>><<< 34886 1727204503.76608: stdout chunk (state=3): >>><<< 34886 1727204503.76692: done transferring module to remote 34886 1727204503.76696: _low_level_execute_command(): starting 34886 1727204503.76699: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155/ /root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155/AnsiballZ_command.py && sleep 0' 34886 1727204503.77087: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204503.77093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.77096: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204503.77101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.77162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204503.77164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204503.77197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204503.79037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204503.79080: stderr chunk (state=3): >>><<< 34886 1727204503.79083: stdout chunk (state=3): >>><<< 34886 1727204503.79104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204503.79108: _low_level_execute_command(): starting 34886 1727204503.79114: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155/AnsiballZ_command.py && sleep 0' 34886 1727204503.79568: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204503.79572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204503.79574: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.79576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204503.79579: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204503.79632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204503.79640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204503.79679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204503.99007: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-24 15:01:43.969132", "end": "2024-09-24 15:01:43.988867", "delta": "0:00:00.019735", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204504.00685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204504.00753: stderr chunk (state=3): >>><<< 34886 1727204504.00757: stdout chunk (state=3): >>><<< 34886 1727204504.00776: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-24 15:01:43.969132", "end": "2024-09-24 15:01:43.988867", "delta": "0:00:00.019735", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
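
The module invocation above pins down the check that just ran on the target: a shell pipeline that filters "nmcli -f NAME,FILENAME connection show" for a veth0 profile whose backing file lives under /etc. Reconstructed purely from the module_args shown here and from the nm_profile_exists.rc conditional that appears later in this log (the real task in get_profile_stat.yml presumably templates the interface name and may differ in other details), a minimal sketch of the task would be:

    # Hedged reconstruction, not copied from get_profile_stat.yml: the command string
    # and the registered variable name are taken from the log; ignore_errors is an
    # assumption so the play can continue when grep finds no matching profile.
    - name: Get NM profile info
      shell: "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc"
      register: nm_profile_exists
      ignore_errors: true

On this run the pipeline matched, so rc=0 and stdout carries the profile name plus its keyfile path, which is exactly what the next task keys on.
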
34886 1727204504.00816: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204504.00826: _low_level_execute_command(): starting 34886 1727204504.00836: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204503.7047958-36105-136747082288155/ > /dev/null 2>&1 && sleep 0' 34886 1727204504.01338: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204504.01342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204504.01349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204504.01352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204504.01396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204504.01415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204504.01450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204504.03332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204504.03384: stderr chunk (state=3): >>><<< 34886 1727204504.03388: stdout chunk (state=3): >>><<< 34886 1727204504.03408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204504.03415: handler run complete 34886 1727204504.03439: Evaluated conditional (False): False 34886 1727204504.03453: attempt loop complete, returning result 34886 1727204504.03456: _execute() done 34886 1727204504.03458: dumping result to json 34886 1727204504.03465: done dumping result, returning 34886 1727204504.03473: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [12b410aa-8751-04b9-2e74-0000000004b3] 34886 1727204504.03479: sending task result for task 12b410aa-8751-04b9-2e74-0000000004b3 34886 1727204504.03591: done sending task result for task 12b410aa-8751-04b9-2e74-0000000004b3 34886 1727204504.03595: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.019735", "end": "2024-09-24 15:01:43.988867", "rc": 0, "start": "2024-09-24 15:01:43.969132" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 34886 1727204504.03681: no more pending results, returning what we have 34886 1727204504.03685: results queue empty 34886 1727204504.03687: checking for any_errors_fatal 34886 1727204504.03696: done checking for any_errors_fatal 34886 1727204504.03697: checking for max_fail_percentage 34886 1727204504.03699: done checking for max_fail_percentage 34886 1727204504.03700: checking to see if all hosts have failed and the running result is not ok 34886 1727204504.03701: done checking to see if all hosts have failed 34886 1727204504.03702: getting the remaining hosts for this loop 34886 1727204504.03705: done getting the remaining hosts for this loop 34886 1727204504.03709: getting the next task for host managed-node3 34886 1727204504.03716: done getting next task for host managed-node3 34886 1727204504.03721: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 34886 1727204504.03725: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204504.03729: getting variables 34886 1727204504.03731: in VariableManager get_vars() 34886 1727204504.03773: Calling all_inventory to load vars for managed-node3 34886 1727204504.03777: Calling groups_inventory to load vars for managed-node3 34886 1727204504.03779: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204504.03798: Calling all_plugins_play to load vars for managed-node3 34886 1727204504.03803: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204504.03807: Calling groups_plugins_play to load vars for managed-node3 34886 1727204504.05192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204504.06752: done with get_vars() 34886 1727204504.06773: done getting variables 34886 1727204504.06827: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.406) 0:00:22.236 ***** 34886 1727204504.06855: entering _queue_task() for managed-node3/set_fact 34886 1727204504.07113: worker is 1 (out of 1 available) 34886 1727204504.07131: exiting _queue_task() for managed-node3/set_fact 34886 1727204504.07145: done queuing things up, now waiting for results queue to drain 34886 1727204504.07147: waiting for pending results... 
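
The task queued here only has to look at the registered result of the previous step, and its outcome is printed a little further down: the conditional nm_profile_exists.rc == 0 evaluates True and three lsr_net_profile_* facts come back in ansible_facts. Assembled from those two pieces of the log (not from the task file itself), an equivalent task is roughly:

    # Plausible equivalent of the task at get_profile_stat.yml:35, inferred from the
    # evaluated conditional and the ansible_facts shown in the result below.
    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0

These three flags are what the assert tasks at the end of this section check.
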
34886 1727204504.07360: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 34886 1727204504.07464: in run() - task 12b410aa-8751-04b9-2e74-0000000004b4 34886 1727204504.07478: variable 'ansible_search_path' from source: unknown 34886 1727204504.07482: variable 'ansible_search_path' from source: unknown 34886 1727204504.07517: calling self._execute() 34886 1727204504.07599: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.07612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.07622: variable 'omit' from source: magic vars 34886 1727204504.07932: variable 'ansible_distribution_major_version' from source: facts 34886 1727204504.07944: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204504.08055: variable 'nm_profile_exists' from source: set_fact 34886 1727204504.08071: Evaluated conditional (nm_profile_exists.rc == 0): True 34886 1727204504.08077: variable 'omit' from source: magic vars 34886 1727204504.08115: variable 'omit' from source: magic vars 34886 1727204504.08144: variable 'omit' from source: magic vars 34886 1727204504.08181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204504.08295: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204504.08299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204504.08302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204504.08304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204504.08333: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204504.08342: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.08350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.08480: Set connection var ansible_timeout to 10 34886 1727204504.08498: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204504.08507: Set connection var ansible_connection to ssh 34886 1727204504.08523: Set connection var ansible_shell_executable to /bin/sh 34886 1727204504.08540: Set connection var ansible_pipelining to False 34886 1727204504.08549: Set connection var ansible_shell_type to sh 34886 1727204504.08582: variable 'ansible_shell_executable' from source: unknown 34886 1727204504.08593: variable 'ansible_connection' from source: unknown 34886 1727204504.08602: variable 'ansible_module_compression' from source: unknown 34886 1727204504.08694: variable 'ansible_shell_type' from source: unknown 34886 1727204504.08698: variable 'ansible_shell_executable' from source: unknown 34886 1727204504.08701: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.08703: variable 'ansible_pipelining' from source: unknown 34886 1727204504.08705: variable 'ansible_timeout' from source: unknown 34886 1727204504.08708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.08821: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204504.08842: variable 'omit' from source: magic vars 34886 1727204504.08856: starting attempt loop 34886 1727204504.08865: running the handler 34886 1727204504.08890: handler run complete 34886 1727204504.08910: attempt loop complete, returning result 34886 1727204504.08920: _execute() done 34886 1727204504.08929: dumping result to json 34886 1727204504.08939: done dumping result, returning 34886 1727204504.08953: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-04b9-2e74-0000000004b4] 34886 1727204504.08967: sending task result for task 12b410aa-8751-04b9-2e74-0000000004b4 ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 34886 1727204504.09146: no more pending results, returning what we have 34886 1727204504.09150: results queue empty 34886 1727204504.09152: checking for any_errors_fatal 34886 1727204504.09161: done checking for any_errors_fatal 34886 1727204504.09162: checking for max_fail_percentage 34886 1727204504.09164: done checking for max_fail_percentage 34886 1727204504.09165: checking to see if all hosts have failed and the running result is not ok 34886 1727204504.09166: done checking to see if all hosts have failed 34886 1727204504.09167: getting the remaining hosts for this loop 34886 1727204504.09168: done getting the remaining hosts for this loop 34886 1727204504.09174: getting the next task for host managed-node3 34886 1727204504.09183: done getting next task for host managed-node3 34886 1727204504.09186: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 34886 1727204504.09192: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204504.09197: getting variables 34886 1727204504.09199: in VariableManager get_vars() 34886 1727204504.09410: Calling all_inventory to load vars for managed-node3 34886 1727204504.09414: Calling groups_inventory to load vars for managed-node3 34886 1727204504.09416: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204504.09430: Calling all_plugins_play to load vars for managed-node3 34886 1727204504.09433: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204504.09438: Calling groups_plugins_play to load vars for managed-node3 34886 1727204504.09979: done sending task result for task 12b410aa-8751-04b9-2e74-0000000004b4 34886 1727204504.09983: WORKER PROCESS EXITING 34886 1727204504.11711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204504.13980: done with get_vars() 34886 1727204504.14006: done getting variables 34886 1727204504.14058: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204504.14158: variable 'profile' from source: include params 34886 1727204504.14162: variable 'interface' from source: play vars 34886 1727204504.14217: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.073) 0:00:22.310 ***** 34886 1727204504.14250: entering _queue_task() for managed-node3/command 34886 1727204504.14495: worker is 1 (out of 1 available) 34886 1727204504.14510: exiting _queue_task() for managed-node3/command 34886 1727204504.14523: done queuing things up, now waiting for results queue to drain 34886 1727204504.14525: waiting for pending results... 
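
The task just queued, and the three that follow it (get_profile_stat.yml:49, :56, :62, :69), are all guarded by profile_stat.stat.exists and get skipped, which is expected here: the profile found above is a NetworkManager keyfile under /etc/NetworkManager/system-connections, so there is no initscripts ifcfg file to inspect. A sketch of that guard pattern, assuming profile_stat is the registered result of an earlier stat task that is not shown in this part of the log, with the file path and grep command chosen only for illustration:

    # Illustrative only: the stat task and the ifcfg path are assumptions; the log
    # confirms just the command action plugin and the when: guard that skips it.
    - name: Stat the ifcfg file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-veth0
      register: profile_stat

    - name: Get the ansible_managed comment in ifcfg-veth0
      command: grep '^# Ansible managed' /etc/sysconfig/network-scripts/ifcfg-veth0
      when: profile_stat.stat.exists

Each of the four skipped results below reports false_condition: profile_stat.stat.exists, confirming the same guard.
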
34886 1727204504.14708: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-veth0 34886 1727204504.14804: in run() - task 12b410aa-8751-04b9-2e74-0000000004b6 34886 1727204504.14818: variable 'ansible_search_path' from source: unknown 34886 1727204504.14822: variable 'ansible_search_path' from source: unknown 34886 1727204504.14857: calling self._execute() 34886 1727204504.14942: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.14949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.14958: variable 'omit' from source: magic vars 34886 1727204504.15372: variable 'ansible_distribution_major_version' from source: facts 34886 1727204504.15382: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204504.15493: variable 'profile_stat' from source: set_fact 34886 1727204504.15505: Evaluated conditional (profile_stat.stat.exists): False 34886 1727204504.15509: when evaluation is False, skipping this task 34886 1727204504.15512: _execute() done 34886 1727204504.15517: dumping result to json 34886 1727204504.15523: done dumping result, returning 34886 1727204504.15526: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [12b410aa-8751-04b9-2e74-0000000004b6] 34886 1727204504.15536: sending task result for task 12b410aa-8751-04b9-2e74-0000000004b6 34886 1727204504.15629: done sending task result for task 12b410aa-8751-04b9-2e74-0000000004b6 34886 1727204504.15632: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34886 1727204504.15705: no more pending results, returning what we have 34886 1727204504.15710: results queue empty 34886 1727204504.15711: checking for any_errors_fatal 34886 1727204504.15717: done checking for any_errors_fatal 34886 1727204504.15718: checking for max_fail_percentage 34886 1727204504.15722: done checking for max_fail_percentage 34886 1727204504.15723: checking to see if all hosts have failed and the running result is not ok 34886 1727204504.15724: done checking to see if all hosts have failed 34886 1727204504.15725: getting the remaining hosts for this loop 34886 1727204504.15726: done getting the remaining hosts for this loop 34886 1727204504.15730: getting the next task for host managed-node3 34886 1727204504.15737: done getting next task for host managed-node3 34886 1727204504.15739: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 34886 1727204504.15743: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204504.15747: getting variables 34886 1727204504.15748: in VariableManager get_vars() 34886 1727204504.15784: Calling all_inventory to load vars for managed-node3 34886 1727204504.15787: Calling groups_inventory to load vars for managed-node3 34886 1727204504.15794: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204504.15805: Calling all_plugins_play to load vars for managed-node3 34886 1727204504.15809: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204504.15812: Calling groups_plugins_play to load vars for managed-node3 34886 1727204504.17054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204504.19690: done with get_vars() 34886 1727204504.19714: done getting variables 34886 1727204504.19766: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204504.19862: variable 'profile' from source: include params 34886 1727204504.19865: variable 'interface' from source: play vars 34886 1727204504.19915: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.056) 0:00:22.367 ***** 34886 1727204504.19943: entering _queue_task() for managed-node3/set_fact 34886 1727204504.20200: worker is 1 (out of 1 available) 34886 1727204504.20217: exiting _queue_task() for managed-node3/set_fact 34886 1727204504.20232: done queuing things up, now waiting for results queue to drain 34886 1727204504.20234: waiting for pending results... 
34886 1727204504.20442: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 34886 1727204504.20543: in run() - task 12b410aa-8751-04b9-2e74-0000000004b7 34886 1727204504.20556: variable 'ansible_search_path' from source: unknown 34886 1727204504.20561: variable 'ansible_search_path' from source: unknown 34886 1727204504.20596: calling self._execute() 34886 1727204504.20749: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.20755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.20759: variable 'omit' from source: magic vars 34886 1727204504.21013: variable 'ansible_distribution_major_version' from source: facts 34886 1727204504.21026: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204504.21131: variable 'profile_stat' from source: set_fact 34886 1727204504.21144: Evaluated conditional (profile_stat.stat.exists): False 34886 1727204504.21147: when evaluation is False, skipping this task 34886 1727204504.21152: _execute() done 34886 1727204504.21156: dumping result to json 34886 1727204504.21161: done dumping result, returning 34886 1727204504.21167: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [12b410aa-8751-04b9-2e74-0000000004b7] 34886 1727204504.21173: sending task result for task 12b410aa-8751-04b9-2e74-0000000004b7 34886 1727204504.21266: done sending task result for task 12b410aa-8751-04b9-2e74-0000000004b7 34886 1727204504.21269: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34886 1727204504.21323: no more pending results, returning what we have 34886 1727204504.21328: results queue empty 34886 1727204504.21329: checking for any_errors_fatal 34886 1727204504.21338: done checking for any_errors_fatal 34886 1727204504.21339: checking for max_fail_percentage 34886 1727204504.21341: done checking for max_fail_percentage 34886 1727204504.21342: checking to see if all hosts have failed and the running result is not ok 34886 1727204504.21343: done checking to see if all hosts have failed 34886 1727204504.21344: getting the remaining hosts for this loop 34886 1727204504.21346: done getting the remaining hosts for this loop 34886 1727204504.21351: getting the next task for host managed-node3 34886 1727204504.21359: done getting next task for host managed-node3 34886 1727204504.21362: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 34886 1727204504.21366: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204504.21370: getting variables 34886 1727204504.21371: in VariableManager get_vars() 34886 1727204504.21422: Calling all_inventory to load vars for managed-node3 34886 1727204504.21426: Calling groups_inventory to load vars for managed-node3 34886 1727204504.21429: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204504.21443: Calling all_plugins_play to load vars for managed-node3 34886 1727204504.21447: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204504.21451: Calling groups_plugins_play to load vars for managed-node3 34886 1727204504.28288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204504.31254: done with get_vars() 34886 1727204504.31292: done getting variables 34886 1727204504.31350: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204504.31461: variable 'profile' from source: include params 34886 1727204504.31469: variable 'interface' from source: play vars 34886 1727204504.31541: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.116) 0:00:22.483 ***** 34886 1727204504.31577: entering _queue_task() for managed-node3/command 34886 1727204504.31947: worker is 1 (out of 1 available) 34886 1727204504.31964: exiting _queue_task() for managed-node3/command 34886 1727204504.31980: done queuing things up, now waiting for results queue to drain 34886 1727204504.31982: waiting for pending results... 
34886 1727204504.32313: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-veth0 34886 1727204504.32471: in run() - task 12b410aa-8751-04b9-2e74-0000000004b8 34886 1727204504.32475: variable 'ansible_search_path' from source: unknown 34886 1727204504.32478: variable 'ansible_search_path' from source: unknown 34886 1727204504.32492: calling self._execute() 34886 1727204504.32629: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.32645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.32665: variable 'omit' from source: magic vars 34886 1727204504.33133: variable 'ansible_distribution_major_version' from source: facts 34886 1727204504.33156: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204504.33325: variable 'profile_stat' from source: set_fact 34886 1727204504.33394: Evaluated conditional (profile_stat.stat.exists): False 34886 1727204504.33398: when evaluation is False, skipping this task 34886 1727204504.33401: _execute() done 34886 1727204504.33404: dumping result to json 34886 1727204504.33407: done dumping result, returning 34886 1727204504.33410: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-veth0 [12b410aa-8751-04b9-2e74-0000000004b8] 34886 1727204504.33422: sending task result for task 12b410aa-8751-04b9-2e74-0000000004b8 34886 1727204504.33572: done sending task result for task 12b410aa-8751-04b9-2e74-0000000004b8 34886 1727204504.33576: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34886 1727204504.33635: no more pending results, returning what we have 34886 1727204504.33640: results queue empty 34886 1727204504.33641: checking for any_errors_fatal 34886 1727204504.33650: done checking for any_errors_fatal 34886 1727204504.33651: checking for max_fail_percentage 34886 1727204504.33653: done checking for max_fail_percentage 34886 1727204504.33654: checking to see if all hosts have failed and the running result is not ok 34886 1727204504.33655: done checking to see if all hosts have failed 34886 1727204504.33656: getting the remaining hosts for this loop 34886 1727204504.33657: done getting the remaining hosts for this loop 34886 1727204504.33661: getting the next task for host managed-node3 34886 1727204504.33669: done getting next task for host managed-node3 34886 1727204504.33671: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 34886 1727204504.33676: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204504.33681: getting variables 34886 1727204504.33682: in VariableManager get_vars() 34886 1727204504.33731: Calling all_inventory to load vars for managed-node3 34886 1727204504.33735: Calling groups_inventory to load vars for managed-node3 34886 1727204504.33737: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204504.33752: Calling all_plugins_play to load vars for managed-node3 34886 1727204504.33755: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204504.33759: Calling groups_plugins_play to load vars for managed-node3 34886 1727204504.36408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204504.38364: done with get_vars() 34886 1727204504.38388: done getting variables 34886 1727204504.38442: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204504.38540: variable 'profile' from source: include params 34886 1727204504.38544: variable 'interface' from source: play vars 34886 1727204504.38597: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.070) 0:00:22.554 ***** 34886 1727204504.38639: entering _queue_task() for managed-node3/set_fact 34886 1727204504.39012: worker is 1 (out of 1 available) 34886 1727204504.39031: exiting _queue_task() for managed-node3/set_fact 34886 1727204504.39045: done queuing things up, now waiting for results queue to drain 34886 1727204504.39047: waiting for pending results... 
34886 1727204504.39446: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-veth0 34886 1727204504.39543: in run() - task 12b410aa-8751-04b9-2e74-0000000004b9 34886 1727204504.39548: variable 'ansible_search_path' from source: unknown 34886 1727204504.39553: variable 'ansible_search_path' from source: unknown 34886 1727204504.39655: calling self._execute() 34886 1727204504.39816: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.39820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.39825: variable 'omit' from source: magic vars 34886 1727204504.40156: variable 'ansible_distribution_major_version' from source: facts 34886 1727204504.40169: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204504.40273: variable 'profile_stat' from source: set_fact 34886 1727204504.40287: Evaluated conditional (profile_stat.stat.exists): False 34886 1727204504.40292: when evaluation is False, skipping this task 34886 1727204504.40295: _execute() done 34886 1727204504.40298: dumping result to json 34886 1727204504.40309: done dumping result, returning 34886 1727204504.40312: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [12b410aa-8751-04b9-2e74-0000000004b9] 34886 1727204504.40315: sending task result for task 12b410aa-8751-04b9-2e74-0000000004b9 34886 1727204504.40412: done sending task result for task 12b410aa-8751-04b9-2e74-0000000004b9 34886 1727204504.40415: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34886 1727204504.40466: no more pending results, returning what we have 34886 1727204504.40471: results queue empty 34886 1727204504.40473: checking for any_errors_fatal 34886 1727204504.40479: done checking for any_errors_fatal 34886 1727204504.40480: checking for max_fail_percentage 34886 1727204504.40481: done checking for max_fail_percentage 34886 1727204504.40482: checking to see if all hosts have failed and the running result is not ok 34886 1727204504.40483: done checking to see if all hosts have failed 34886 1727204504.40484: getting the remaining hosts for this loop 34886 1727204504.40485: done getting the remaining hosts for this loop 34886 1727204504.40492: getting the next task for host managed-node3 34886 1727204504.40501: done getting next task for host managed-node3 34886 1727204504.40505: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 34886 1727204504.40508: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204504.40512: getting variables 34886 1727204504.40514: in VariableManager get_vars() 34886 1727204504.40558: Calling all_inventory to load vars for managed-node3 34886 1727204504.40561: Calling groups_inventory to load vars for managed-node3 34886 1727204504.40564: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204504.40576: Calling all_plugins_play to load vars for managed-node3 34886 1727204504.40579: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204504.40583: Calling groups_plugins_play to load vars for managed-node3 34886 1727204504.41821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204504.44813: done with get_vars() 34886 1727204504.44855: done getting variables 34886 1727204504.44936: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204504.45076: variable 'profile' from source: include params 34886 1727204504.45081: variable 'interface' from source: play vars 34886 1727204504.45164: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.065) 0:00:22.619 ***** 34886 1727204504.45201: entering _queue_task() for managed-node3/assert 34886 1727204504.45593: worker is 1 (out of 1 available) 34886 1727204504.45608: exiting _queue_task() for managed-node3/assert 34886 1727204504.45621: done queuing things up, now waiting for results queue to drain 34886 1727204504.45623: waiting for pending results... 
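
The assert tasks from assert_profile_present.yml (:5, :10, :15) close the loop on the flags set earlier; the log below shows each conditional evaluating True and the handler reporting "All assertions passed". A minimal sketch of the first one, inferred from the evaluated conditional rather than copied from the file:

    # Inferred from the log: the task name is templated with the profile variable and
    # the assert action checks the lsr_net_profile_exists fact set earlier.
    - name: Assert that the profile is present - '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_exists

The two tasks that follow assert lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint in the same way.
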
34886 1727204504.46012: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'veth0' 34886 1727204504.46024: in run() - task 12b410aa-8751-04b9-2e74-0000000003b9 34886 1727204504.46042: variable 'ansible_search_path' from source: unknown 34886 1727204504.46046: variable 'ansible_search_path' from source: unknown 34886 1727204504.46099: calling self._execute() 34886 1727204504.46227: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.46242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.46257: variable 'omit' from source: magic vars 34886 1727204504.46894: variable 'ansible_distribution_major_version' from source: facts 34886 1727204504.46899: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204504.46902: variable 'omit' from source: magic vars 34886 1727204504.46904: variable 'omit' from source: magic vars 34886 1727204504.46907: variable 'profile' from source: include params 34886 1727204504.46915: variable 'interface' from source: play vars 34886 1727204504.46988: variable 'interface' from source: play vars 34886 1727204504.47017: variable 'omit' from source: magic vars 34886 1727204504.47065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204504.47115: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204504.47142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204504.47166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204504.47182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204504.47229: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204504.47233: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.47241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.47371: Set connection var ansible_timeout to 10 34886 1727204504.47395: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204504.47398: Set connection var ansible_connection to ssh 34886 1727204504.47401: Set connection var ansible_shell_executable to /bin/sh 34886 1727204504.47405: Set connection var ansible_pipelining to False 34886 1727204504.47407: Set connection var ansible_shell_type to sh 34886 1727204504.47462: variable 'ansible_shell_executable' from source: unknown 34886 1727204504.47466: variable 'ansible_connection' from source: unknown 34886 1727204504.47469: variable 'ansible_module_compression' from source: unknown 34886 1727204504.47472: variable 'ansible_shell_type' from source: unknown 34886 1727204504.47474: variable 'ansible_shell_executable' from source: unknown 34886 1727204504.47476: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.47478: variable 'ansible_pipelining' from source: unknown 34886 1727204504.47481: variable 'ansible_timeout' from source: unknown 34886 1727204504.47483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.47682: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204504.47686: variable 'omit' from source: magic vars 34886 1727204504.47688: starting attempt loop 34886 1727204504.47693: running the handler 34886 1727204504.47817: variable 'lsr_net_profile_exists' from source: set_fact 34886 1727204504.47898: Evaluated conditional (lsr_net_profile_exists): True 34886 1727204504.47902: handler run complete 34886 1727204504.47905: attempt loop complete, returning result 34886 1727204504.47907: _execute() done 34886 1727204504.47909: dumping result to json 34886 1727204504.47912: done dumping result, returning 34886 1727204504.47914: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'veth0' [12b410aa-8751-04b9-2e74-0000000003b9] 34886 1727204504.47916: sending task result for task 12b410aa-8751-04b9-2e74-0000000003b9 34886 1727204504.47980: done sending task result for task 12b410aa-8751-04b9-2e74-0000000003b9 34886 1727204504.47983: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 34886 1727204504.48051: no more pending results, returning what we have 34886 1727204504.48055: results queue empty 34886 1727204504.48056: checking for any_errors_fatal 34886 1727204504.48064: done checking for any_errors_fatal 34886 1727204504.48065: checking for max_fail_percentage 34886 1727204504.48067: done checking for max_fail_percentage 34886 1727204504.48068: checking to see if all hosts have failed and the running result is not ok 34886 1727204504.48068: done checking to see if all hosts have failed 34886 1727204504.48069: getting the remaining hosts for this loop 34886 1727204504.48071: done getting the remaining hosts for this loop 34886 1727204504.48075: getting the next task for host managed-node3 34886 1727204504.48082: done getting next task for host managed-node3 34886 1727204504.48085: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 34886 1727204504.48088: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204504.48095: getting variables 34886 1727204504.48096: in VariableManager get_vars() 34886 1727204504.48142: Calling all_inventory to load vars for managed-node3 34886 1727204504.48146: Calling groups_inventory to load vars for managed-node3 34886 1727204504.48148: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204504.48159: Calling all_plugins_play to load vars for managed-node3 34886 1727204504.48162: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204504.48168: Calling groups_plugins_play to load vars for managed-node3 34886 1727204504.50764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204504.53923: done with get_vars() 34886 1727204504.53978: done getting variables 34886 1727204504.54059: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204504.54213: variable 'profile' from source: include params 34886 1727204504.54218: variable 'interface' from source: play vars 34886 1727204504.54295: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.091) 0:00:22.711 ***** 34886 1727204504.54345: entering _queue_task() for managed-node3/assert 34886 1727204504.54862: worker is 1 (out of 1 available) 34886 1727204504.54876: exiting _queue_task() for managed-node3/assert 34886 1727204504.54888: done queuing things up, now waiting for results queue to drain 34886 1727204504.54893: waiting for pending results... 
34886 1727204504.55187: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'veth0' 34886 1727204504.55253: in run() - task 12b410aa-8751-04b9-2e74-0000000003ba 34886 1727204504.55269: variable 'ansible_search_path' from source: unknown 34886 1727204504.55491: variable 'ansible_search_path' from source: unknown 34886 1727204504.55499: calling self._execute() 34886 1727204504.55504: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.55508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.55512: variable 'omit' from source: magic vars 34886 1727204504.55922: variable 'ansible_distribution_major_version' from source: facts 34886 1727204504.55939: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204504.55948: variable 'omit' from source: magic vars 34886 1727204504.56008: variable 'omit' from source: magic vars 34886 1727204504.56142: variable 'profile' from source: include params 34886 1727204504.56146: variable 'interface' from source: play vars 34886 1727204504.56233: variable 'interface' from source: play vars 34886 1727204504.56286: variable 'omit' from source: magic vars 34886 1727204504.56308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204504.56353: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204504.56391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204504.56405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204504.56420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204504.56497: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204504.56501: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.56504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.56596: Set connection var ansible_timeout to 10 34886 1727204504.56609: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204504.56612: Set connection var ansible_connection to ssh 34886 1727204504.56629: Set connection var ansible_shell_executable to /bin/sh 34886 1727204504.56641: Set connection var ansible_pipelining to False 34886 1727204504.56645: Set connection var ansible_shell_type to sh 34886 1727204504.56694: variable 'ansible_shell_executable' from source: unknown 34886 1727204504.56698: variable 'ansible_connection' from source: unknown 34886 1727204504.56701: variable 'ansible_module_compression' from source: unknown 34886 1727204504.56703: variable 'ansible_shell_type' from source: unknown 34886 1727204504.56706: variable 'ansible_shell_executable' from source: unknown 34886 1727204504.56823: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.56828: variable 'ansible_pipelining' from source: unknown 34886 1727204504.56832: variable 'ansible_timeout' from source: unknown 34886 1727204504.56836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.56897: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204504.56910: variable 'omit' from source: magic vars 34886 1727204504.56918: starting attempt loop 34886 1727204504.56921: running the handler 34886 1727204504.57075: variable 'lsr_net_profile_ansible_managed' from source: set_fact 34886 1727204504.57082: Evaluated conditional (lsr_net_profile_ansible_managed): True 34886 1727204504.57092: handler run complete 34886 1727204504.57114: attempt loop complete, returning result 34886 1727204504.57117: _execute() done 34886 1727204504.57120: dumping result to json 34886 1727204504.57127: done dumping result, returning 34886 1727204504.57137: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'veth0' [12b410aa-8751-04b9-2e74-0000000003ba] 34886 1727204504.57148: sending task result for task 12b410aa-8751-04b9-2e74-0000000003ba 34886 1727204504.57366: done sending task result for task 12b410aa-8751-04b9-2e74-0000000003ba 34886 1727204504.57371: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 34886 1727204504.57439: no more pending results, returning what we have 34886 1727204504.57443: results queue empty 34886 1727204504.57445: checking for any_errors_fatal 34886 1727204504.57452: done checking for any_errors_fatal 34886 1727204504.57453: checking for max_fail_percentage 34886 1727204504.57455: done checking for max_fail_percentage 34886 1727204504.57457: checking to see if all hosts have failed and the running result is not ok 34886 1727204504.57458: done checking to see if all hosts have failed 34886 1727204504.57459: getting the remaining hosts for this loop 34886 1727204504.57460: done getting the remaining hosts for this loop 34886 1727204504.57465: getting the next task for host managed-node3 34886 1727204504.57473: done getting next task for host managed-node3 34886 1727204504.57476: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 34886 1727204504.57479: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204504.57484: getting variables 34886 1727204504.57488: in VariableManager get_vars() 34886 1727204504.57542: Calling all_inventory to load vars for managed-node3 34886 1727204504.57546: Calling groups_inventory to load vars for managed-node3 34886 1727204504.57550: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204504.57563: Calling all_plugins_play to load vars for managed-node3 34886 1727204504.57568: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204504.57572: Calling groups_plugins_play to load vars for managed-node3 34886 1727204504.60023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204504.63313: done with get_vars() 34886 1727204504.63350: done getting variables 34886 1727204504.63431: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204504.63569: variable 'profile' from source: include params 34886 1727204504.63573: variable 'interface' from source: play vars 34886 1727204504.63656: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.093) 0:00:22.804 ***** 34886 1727204504.63702: entering _queue_task() for managed-node3/assert 34886 1727204504.64087: worker is 1 (out of 1 available) 34886 1727204504.64206: exiting _queue_task() for managed-node3/assert 34886 1727204504.64217: done queuing things up, now waiting for results queue to drain 34886 1727204504.64222: waiting for pending results... 
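The companion task at assert_profile_present.yml:15, queued at the end of the entry above, mirrors the previous assertion but gates on the lsr_net_profile_fingerprint fact (evaluated further down in the log). Again a reconstructed sketch rather than a quote of the source file:

  - name: "Assert that the fingerprint comment is present in {{ profile }}"
    ansible.builtin.assert:
      that:
        - lsr_net_profile_fingerprint
      fail_msg: "Profile {{ profile }} does not carry the fingerprint comment"  # illustrative text, not from the source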
34886 1727204504.64437: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in veth0 34886 1727204504.64554: in run() - task 12b410aa-8751-04b9-2e74-0000000003bb 34886 1727204504.64569: variable 'ansible_search_path' from source: unknown 34886 1727204504.64572: variable 'ansible_search_path' from source: unknown 34886 1727204504.64795: calling self._execute() 34886 1727204504.64800: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.64803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.64806: variable 'omit' from source: magic vars 34886 1727204504.65208: variable 'ansible_distribution_major_version' from source: facts 34886 1727204504.65222: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204504.65233: variable 'omit' from source: magic vars 34886 1727204504.65286: variable 'omit' from source: magic vars 34886 1727204504.65417: variable 'profile' from source: include params 34886 1727204504.65421: variable 'interface' from source: play vars 34886 1727204504.65508: variable 'interface' from source: play vars 34886 1727204504.65534: variable 'omit' from source: magic vars 34886 1727204504.65580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204504.65632: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204504.65656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204504.65683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204504.65693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204504.65743: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204504.65747: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.65750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.65907: Set connection var ansible_timeout to 10 34886 1727204504.65911: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204504.65913: Set connection var ansible_connection to ssh 34886 1727204504.65994: Set connection var ansible_shell_executable to /bin/sh 34886 1727204504.65998: Set connection var ansible_pipelining to False 34886 1727204504.66001: Set connection var ansible_shell_type to sh 34886 1727204504.66003: variable 'ansible_shell_executable' from source: unknown 34886 1727204504.66006: variable 'ansible_connection' from source: unknown 34886 1727204504.66013: variable 'ansible_module_compression' from source: unknown 34886 1727204504.66017: variable 'ansible_shell_type' from source: unknown 34886 1727204504.66019: variable 'ansible_shell_executable' from source: unknown 34886 1727204504.66022: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.66025: variable 'ansible_pipelining' from source: unknown 34886 1727204504.66028: variable 'ansible_timeout' from source: unknown 34886 1727204504.66032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.66206: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204504.66220: variable 'omit' from source: magic vars 34886 1727204504.66232: starting attempt loop 34886 1727204504.66236: running the handler 34886 1727204504.66398: variable 'lsr_net_profile_fingerprint' from source: set_fact 34886 1727204504.66404: Evaluated conditional (lsr_net_profile_fingerprint): True 34886 1727204504.66412: handler run complete 34886 1727204504.66439: attempt loop complete, returning result 34886 1727204504.66442: _execute() done 34886 1727204504.66450: dumping result to json 34886 1727204504.66453: done dumping result, returning 34886 1727204504.66456: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in veth0 [12b410aa-8751-04b9-2e74-0000000003bb] 34886 1727204504.66470: sending task result for task 12b410aa-8751-04b9-2e74-0000000003bb 34886 1727204504.66697: done sending task result for task 12b410aa-8751-04b9-2e74-0000000003bb 34886 1727204504.66701: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 34886 1727204504.66767: no more pending results, returning what we have 34886 1727204504.66771: results queue empty 34886 1727204504.66773: checking for any_errors_fatal 34886 1727204504.66781: done checking for any_errors_fatal 34886 1727204504.66783: checking for max_fail_percentage 34886 1727204504.66786: done checking for max_fail_percentage 34886 1727204504.66788: checking to see if all hosts have failed and the running result is not ok 34886 1727204504.66789: done checking to see if all hosts have failed 34886 1727204504.66790: getting the remaining hosts for this loop 34886 1727204504.66792: done getting the remaining hosts for this loop 34886 1727204504.66798: getting the next task for host managed-node3 34886 1727204504.66808: done getting next task for host managed-node3 34886 1727204504.66811: ^ task is: TASK: Get ip address information 34886 1727204504.66814: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204504.66818: getting variables 34886 1727204504.66823: in VariableManager get_vars() 34886 1727204504.66875: Calling all_inventory to load vars for managed-node3 34886 1727204504.66879: Calling groups_inventory to load vars for managed-node3 34886 1727204504.66882: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204504.67013: Calling all_plugins_play to load vars for managed-node3 34886 1727204504.67018: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204504.67025: Calling groups_plugins_play to load vars for managed-node3 34886 1727204504.69214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204504.72381: done with get_vars() 34886 1727204504.72430: done getting variables 34886 1727204504.72509: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ip address information] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:53 Tuesday 24 September 2024 15:01:44 -0400 (0:00:00.088) 0:00:22.893 ***** 34886 1727204504.72546: entering _queue_task() for managed-node3/command 34886 1727204504.73035: worker is 1 (out of 1 available) 34886 1727204504.73049: exiting _queue_task() for managed-node3/command 34886 1727204504.73062: done queuing things up, now waiting for results queue to drain 34886 1727204504.73064: waiting for pending results... 34886 1727204504.73339: running TaskExecutor() for managed-node3/TASK: Get ip address information 34886 1727204504.73445: in run() - task 12b410aa-8751-04b9-2e74-00000000005e 34886 1727204504.73456: variable 'ansible_search_path' from source: unknown 34886 1727204504.73554: calling self._execute() 34886 1727204504.73632: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.73639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.73652: variable 'omit' from source: magic vars 34886 1727204504.74152: variable 'ansible_distribution_major_version' from source: facts 34886 1727204504.74165: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204504.74172: variable 'omit' from source: magic vars 34886 1727204504.74207: variable 'omit' from source: magic vars 34886 1727204504.74331: variable 'interface' from source: play vars 34886 1727204504.74355: variable 'omit' from source: magic vars 34886 1727204504.74402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204504.74455: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204504.74478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204504.74502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204504.74530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204504.74562: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 
1727204504.74565: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.74571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.74712: Set connection var ansible_timeout to 10 34886 1727204504.74720: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204504.74748: Set connection var ansible_connection to ssh 34886 1727204504.74752: Set connection var ansible_shell_executable to /bin/sh 34886 1727204504.74756: Set connection var ansible_pipelining to False 34886 1727204504.74758: Set connection var ansible_shell_type to sh 34886 1727204504.74785: variable 'ansible_shell_executable' from source: unknown 34886 1727204504.74791: variable 'ansible_connection' from source: unknown 34886 1727204504.74794: variable 'ansible_module_compression' from source: unknown 34886 1727204504.74797: variable 'ansible_shell_type' from source: unknown 34886 1727204504.74857: variable 'ansible_shell_executable' from source: unknown 34886 1727204504.74861: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204504.74863: variable 'ansible_pipelining' from source: unknown 34886 1727204504.74867: variable 'ansible_timeout' from source: unknown 34886 1727204504.74869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204504.75010: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204504.75026: variable 'omit' from source: magic vars 34886 1727204504.75032: starting attempt loop 34886 1727204504.75036: running the handler 34886 1727204504.75074: _low_level_execute_command(): starting 34886 1727204504.75078: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204504.76015: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204504.76036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204504.76117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204504.77876: stdout chunk (state=3): >>>/root <<< 34886 1727204504.77977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204504.78083: stderr chunk (state=3): >>><<< 34886 1727204504.78097: stdout chunk (state=3): >>><<< 34886 1727204504.78138: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204504.78182: _low_level_execute_command(): starting 34886 1727204504.78186: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012 `" && echo ansible-tmp-1727204504.7814636-36132-74486312028012="` echo /root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012 `" ) && sleep 0' 34886 1727204504.78879: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204504.78898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204504.78915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204504.78939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204504.78971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204504.79006: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204504.79078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204504.79138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204504.79156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204504.79200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204504.79264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204504.81237: stdout chunk (state=3): >>>ansible-tmp-1727204504.7814636-36132-74486312028012=/root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012 <<< 34886 1727204504.81500: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204504.81504: stdout chunk (state=3): >>><<< 34886 1727204504.81506: stderr chunk (state=3): >>><<< 34886 1727204504.81509: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204504.7814636-36132-74486312028012=/root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204504.81528: variable 'ansible_module_compression' from source: unknown 34886 1727204504.81595: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204504.81651: variable 'ansible_facts' from source: unknown 34886 1727204504.81770: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012/AnsiballZ_command.py 34886 1727204504.81960: Sending initial data 34886 1727204504.81971: Sent initial data (155 bytes) 34886 1727204504.82706: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204504.82772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204504.82795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204504.82812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204504.82892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204504.84578: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204504.84584: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204504.84587: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpbqztqiy0 /root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012/AnsiballZ_command.py <<< 34886 1727204504.84795: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012/AnsiballZ_command.py" <<< 34886 1727204504.84819: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpbqztqiy0" to remote "/root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012/AnsiballZ_command.py" <<< 34886 1727204504.85655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204504.85761: stderr chunk (state=3): >>><<< 34886 1727204504.85794: stdout chunk (state=3): >>><<< 34886 1727204504.85836: done transferring module to remote 34886 1727204504.85852: _low_level_execute_command(): starting 34886 1727204504.85861: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012/ /root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012/AnsiballZ_command.py && sleep 0' 34886 1727204504.86508: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204504.86522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204504.86581: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204504.86613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204504.86629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204504.86707: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204504.88498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204504.88550: stderr chunk (state=3): >>><<< 34886 1727204504.88554: stdout chunk (state=3): >>><<< 34886 1727204504.88568: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204504.88571: _low_level_execute_command(): starting 34886 1727204504.88577: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012/AnsiballZ_command.py && sleep 0' 34886 1727204504.89205: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204504.89252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204504.89296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204505.06867: stdout chunk (state=3): >>> {"changed": true, "stdout": "30: veth0@if29: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 1e:53:10:b9:60:a4 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::e43c:8470:c89a:d659/64 
scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-24 15:01:45.063147", "end": "2024-09-24 15:01:45.067433", "delta": "0:00:00.004286", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204505.08553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204505.08614: stderr chunk (state=3): >>><<< 34886 1727204505.08618: stdout chunk (state=3): >>><<< 34886 1727204505.08636: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30: veth0@if29: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 1e:53:10:b9:60:a4 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::e43c:8470:c89a:d659/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-24 15:01:45.063147", "end": "2024-09-24 15:01:45.067433", "delta": "0:00:00.004286", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
34886 1727204505.08682: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip addr show veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204505.08692: _low_level_execute_command(): starting 34886 1727204505.08698: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204504.7814636-36132-74486312028012/ > /dev/null 2>&1 && sleep 0' 34886 1727204505.09185: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204505.09188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.09199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204505.09203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.09255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204505.09259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204505.09305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204505.11228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204505.11284: stderr chunk (state=3): >>><<< 34886 1727204505.11287: stdout chunk (state=3): >>><<< 34886 1727204505.11308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204505.11315: handler run complete 34886 1727204505.11341: Evaluated conditional (False): False 34886 1727204505.11357: attempt loop complete, returning result 34886 1727204505.11360: _execute() done 34886 1727204505.11362: dumping result to json 34886 1727204505.11370: done dumping result, returning 34886 1727204505.11379: done running TaskExecutor() for managed-node3/TASK: Get ip address information [12b410aa-8751-04b9-2e74-00000000005e] 34886 1727204505.11385: sending task result for task 12b410aa-8751-04b9-2e74-00000000005e 34886 1727204505.11502: done sending task result for task 12b410aa-8751-04b9-2e74-00000000005e 34886 1727204505.11505: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "addr", "show", "veth0" ], "delta": "0:00:00.004286", "end": "2024-09-24 15:01:45.067433", "rc": 0, "start": "2024-09-24 15:01:45.063147" } STDOUT: 30: veth0@if29: mtu 1500 qdisc noqueue state UP group default qlen 1000 link/ether 1e:53:10:b9:60:a4 brd ff:ff:ff:ff:ff:ff link-netns ns1 inet6 2001:db8::2/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::3/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::4/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 fe80::e43c:8470:c89a:d659/64 scope link noprefixroute valid_lft forever preferred_lft forever 34886 1727204505.11612: no more pending results, returning what we have 34886 1727204505.11621: results queue empty 34886 1727204505.11622: checking for any_errors_fatal 34886 1727204505.11628: done checking for any_errors_fatal 34886 1727204505.11629: checking for max_fail_percentage 34886 1727204505.11631: done checking for max_fail_percentage 34886 1727204505.11632: checking to see if all hosts have failed and the running result is not ok 34886 1727204505.11634: done checking to see if all hosts have failed 34886 1727204505.11634: getting the remaining hosts for this loop 34886 1727204505.11636: done getting the remaining hosts for this loop 34886 1727204505.11641: getting the next task for host managed-node3 34886 1727204505.11646: done getting next task for host managed-node3 34886 1727204505.11650: ^ task is: TASK: Show ip_addr 34886 1727204505.11652: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204505.11655: getting variables 34886 1727204505.11657: in VariableManager get_vars() 34886 1727204505.11706: Calling all_inventory to load vars for managed-node3 34886 1727204505.11710: Calling groups_inventory to load vars for managed-node3 34886 1727204505.11713: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204505.11728: Calling all_plugins_play to load vars for managed-node3 34886 1727204505.11732: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204505.11736: Calling groups_plugins_play to load vars for managed-node3 34886 1727204505.13127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204505.15919: done with get_vars() 34886 1727204505.15958: done getting variables 34886 1727204505.16039: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ip_addr] ************************************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:57 Tuesday 24 September 2024 15:01:45 -0400 (0:00:00.435) 0:00:23.328 ***** 34886 1727204505.16076: entering _queue_task() for managed-node3/debug 34886 1727204505.16401: worker is 1 (out of 1 available) 34886 1727204505.16417: exiting _queue_task() for managed-node3/debug 34886 1727204505.16431: done queuing things up, now waiting for results queue to drain 34886 1727204505.16433: waiting for pending results... 34886 1727204505.16821: running TaskExecutor() for managed-node3/TASK: Show ip_addr 34886 1727204505.16898: in run() - task 12b410aa-8751-04b9-2e74-00000000005f 34886 1727204505.16903: variable 'ansible_search_path' from source: unknown 34886 1727204505.16951: calling self._execute() 34886 1727204505.17135: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.17140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.17143: variable 'omit' from source: magic vars 34886 1727204505.17622: variable 'ansible_distribution_major_version' from source: facts 34886 1727204505.17645: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204505.17660: variable 'omit' from source: magic vars 34886 1727204505.17700: variable 'omit' from source: magic vars 34886 1727204505.17792: variable 'omit' from source: magic vars 34886 1727204505.17818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204505.17872: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204505.17912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204505.17941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204505.17964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204505.18094: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204505.18098: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 
1727204505.18101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.18167: Set connection var ansible_timeout to 10 34886 1727204505.18182: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204505.18195: Set connection var ansible_connection to ssh 34886 1727204505.18210: Set connection var ansible_shell_executable to /bin/sh 34886 1727204505.18231: Set connection var ansible_pipelining to False 34886 1727204505.18235: Set connection var ansible_shell_type to sh 34886 1727204505.18256: variable 'ansible_shell_executable' from source: unknown 34886 1727204505.18268: variable 'ansible_connection' from source: unknown 34886 1727204505.18272: variable 'ansible_module_compression' from source: unknown 34886 1727204505.18275: variable 'ansible_shell_type' from source: unknown 34886 1727204505.18277: variable 'ansible_shell_executable' from source: unknown 34886 1727204505.18280: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.18282: variable 'ansible_pipelining' from source: unknown 34886 1727204505.18285: variable 'ansible_timeout' from source: unknown 34886 1727204505.18287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.18414: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204505.18427: variable 'omit' from source: magic vars 34886 1727204505.18433: starting attempt loop 34886 1727204505.18437: running the handler 34886 1727204505.18547: variable 'ip_addr' from source: set_fact 34886 1727204505.18563: handler run complete 34886 1727204505.18579: attempt loop complete, returning result 34886 1727204505.18583: _execute() done 34886 1727204505.18587: dumping result to json 34886 1727204505.18590: done dumping result, returning 34886 1727204505.18602: done running TaskExecutor() for managed-node3/TASK: Show ip_addr [12b410aa-8751-04b9-2e74-00000000005f] 34886 1727204505.18605: sending task result for task 12b410aa-8751-04b9-2e74-00000000005f 34886 1727204505.18694: done sending task result for task 12b410aa-8751-04b9-2e74-00000000005f 34886 1727204505.18697: WORKER PROCESS EXITING ok: [managed-node3] => { "ip_addr.stdout": "30: veth0@if29: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 1e:53:10:b9:60:a4 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::e43c:8470:c89a:d659/64 scope link noprefixroute \n valid_lft forever preferred_lft forever" } 34886 1727204505.18750: no more pending results, returning what we have 34886 1727204505.18754: results queue empty 34886 1727204505.18755: checking for any_errors_fatal 34886 1727204505.18766: done checking for any_errors_fatal 34886 1727204505.18767: checking for max_fail_percentage 34886 1727204505.18769: done checking for max_fail_percentage 34886 1727204505.18770: checking to see if all hosts have failed and the running result is not ok 34886 1727204505.18771: done checking to see if all hosts have failed 34886 1727204505.18772: getting the 
remaining hosts for this loop 34886 1727204505.18773: done getting the remaining hosts for this loop 34886 1727204505.18778: getting the next task for host managed-node3 34886 1727204505.18784: done getting next task for host managed-node3 34886 1727204505.18787: ^ task is: TASK: Assert ipv6 addresses are correctly set 34886 1727204505.18797: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204505.18801: getting variables 34886 1727204505.18802: in VariableManager get_vars() 34886 1727204505.18847: Calling all_inventory to load vars for managed-node3 34886 1727204505.18850: Calling groups_inventory to load vars for managed-node3 34886 1727204505.18853: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204505.18863: Calling all_plugins_play to load vars for managed-node3 34886 1727204505.18866: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204505.18870: Calling groups_plugins_play to load vars for managed-node3 34886 1727204505.20096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204505.21773: done with get_vars() 34886 1727204505.21798: done getting variables 34886 1727204505.21852: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert ipv6 addresses are correctly set] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:60 Tuesday 24 September 2024 15:01:45 -0400 (0:00:00.057) 0:00:23.386 ***** 34886 1727204505.21876: entering _queue_task() for managed-node3/assert 34886 1727204505.22138: worker is 1 (out of 1 available) 34886 1727204505.22152: exiting _queue_task() for managed-node3/assert 34886 1727204505.22167: done queuing things up, now waiting for results queue to drain 34886 1727204505.22169: waiting for pending results... 
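The "Get ip address information" and "Show ip_addr" tasks traced above, together with the "Assert ipv6 addresses are correctly set" task queued at the end of the previous entry, come from tests_ipv6.yml (task paths :53, :57 and :60). From the module arguments ("ip addr show veth0", _uses_shell false), the changed_when conditional evaluated as (False), the debug output keyed on ip_addr.stdout, and the three "inet6 ... in ip_addr.stdout" conditionals evaluated below, they likely read roughly as follows; the register name and the changed_when handling are inferred from the log, not quoted from the source:

  - name: Get ip address information
    ansible.builtin.command: ip addr show {{ interface }}
    register: ip_addr          # inferred: later tasks read ip_addr.stdout
    changed_when: false        # inferred: displayed result reports changed: false

  - name: Show ip_addr
    ansible.builtin.debug:
      var: ip_addr.stdout

  - name: Assert ipv6 addresses are correctly set
    ansible.builtin.assert:
      that:
        - "'inet6 2001:db8::2/32' in ip_addr.stdout"
        - "'inet6 2001:db8::3/32' in ip_addr.stdout"
        - "'inet6 2001:db8::4/32' in ip_addr.stdout"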
34886 1727204505.22361: running TaskExecutor() for managed-node3/TASK: Assert ipv6 addresses are correctly set 34886 1727204505.22433: in run() - task 12b410aa-8751-04b9-2e74-000000000060 34886 1727204505.22446: variable 'ansible_search_path' from source: unknown 34886 1727204505.22480: calling self._execute() 34886 1727204505.22566: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.22573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.22583: variable 'omit' from source: magic vars 34886 1727204505.22901: variable 'ansible_distribution_major_version' from source: facts 34886 1727204505.22911: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204505.22918: variable 'omit' from source: magic vars 34886 1727204505.22936: variable 'omit' from source: magic vars 34886 1727204505.22972: variable 'omit' from source: magic vars 34886 1727204505.23009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204505.23043: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204505.23065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204505.23081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204505.23093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204505.23124: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204505.23128: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.23131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.23222: Set connection var ansible_timeout to 10 34886 1727204505.23226: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204505.23229: Set connection var ansible_connection to ssh 34886 1727204505.23236: Set connection var ansible_shell_executable to /bin/sh 34886 1727204505.23244: Set connection var ansible_pipelining to False 34886 1727204505.23247: Set connection var ansible_shell_type to sh 34886 1727204505.23274: variable 'ansible_shell_executable' from source: unknown 34886 1727204505.23279: variable 'ansible_connection' from source: unknown 34886 1727204505.23282: variable 'ansible_module_compression' from source: unknown 34886 1727204505.23285: variable 'ansible_shell_type' from source: unknown 34886 1727204505.23288: variable 'ansible_shell_executable' from source: unknown 34886 1727204505.23292: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.23294: variable 'ansible_pipelining' from source: unknown 34886 1727204505.23297: variable 'ansible_timeout' from source: unknown 34886 1727204505.23299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.23423: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204505.23431: variable 'omit' from source: magic vars 34886 1727204505.23438: starting attempt loop 34886 1727204505.23441: 
running the handler 34886 1727204505.23562: variable 'ip_addr' from source: set_fact 34886 1727204505.23575: Evaluated conditional ('inet6 2001:db8::2/32' in ip_addr.stdout): True 34886 1727204505.23683: variable 'ip_addr' from source: set_fact 34886 1727204505.23693: Evaluated conditional ('inet6 2001:db8::3/32' in ip_addr.stdout): True 34886 1727204505.23797: variable 'ip_addr' from source: set_fact 34886 1727204505.23806: Evaluated conditional ('inet6 2001:db8::4/32' in ip_addr.stdout): True 34886 1727204505.23815: handler run complete 34886 1727204505.23833: attempt loop complete, returning result 34886 1727204505.23837: _execute() done 34886 1727204505.23840: dumping result to json 34886 1727204505.23843: done dumping result, returning 34886 1727204505.23850: done running TaskExecutor() for managed-node3/TASK: Assert ipv6 addresses are correctly set [12b410aa-8751-04b9-2e74-000000000060] 34886 1727204505.23860: sending task result for task 12b410aa-8751-04b9-2e74-000000000060 34886 1727204505.23955: done sending task result for task 12b410aa-8751-04b9-2e74-000000000060 34886 1727204505.23958: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 34886 1727204505.24018: no more pending results, returning what we have 34886 1727204505.24025: results queue empty 34886 1727204505.24026: checking for any_errors_fatal 34886 1727204505.24034: done checking for any_errors_fatal 34886 1727204505.24035: checking for max_fail_percentage 34886 1727204505.24037: done checking for max_fail_percentage 34886 1727204505.24039: checking to see if all hosts have failed and the running result is not ok 34886 1727204505.24040: done checking to see if all hosts have failed 34886 1727204505.24041: getting the remaining hosts for this loop 34886 1727204505.24042: done getting the remaining hosts for this loop 34886 1727204505.24047: getting the next task for host managed-node3 34886 1727204505.24053: done getting next task for host managed-node3 34886 1727204505.24055: ^ task is: TASK: Get ipv6 routes 34886 1727204505.24057: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204505.24060: getting variables 34886 1727204505.24062: in VariableManager get_vars() 34886 1727204505.24105: Calling all_inventory to load vars for managed-node3 34886 1727204505.24108: Calling groups_inventory to load vars for managed-node3 34886 1727204505.24111: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204505.24124: Calling all_plugins_play to load vars for managed-node3 34886 1727204505.24127: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204505.24131: Calling groups_plugins_play to load vars for managed-node3 34886 1727204505.25331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204505.26899: done with get_vars() 34886 1727204505.26925: done getting variables 34886 1727204505.26974: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:69 Tuesday 24 September 2024 15:01:45 -0400 (0:00:00.051) 0:00:23.437 ***** 34886 1727204505.27001: entering _queue_task() for managed-node3/command 34886 1727204505.27259: worker is 1 (out of 1 available) 34886 1727204505.27276: exiting _queue_task() for managed-node3/command 34886 1727204505.27292: done queuing things up, now waiting for results queue to drain 34886 1727204505.27294: waiting for pending results... 34886 1727204505.27490: running TaskExecutor() for managed-node3/TASK: Get ipv6 routes 34886 1727204505.27567: in run() - task 12b410aa-8751-04b9-2e74-000000000061 34886 1727204505.27580: variable 'ansible_search_path' from source: unknown 34886 1727204505.27615: calling self._execute() 34886 1727204505.27704: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.27709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.27721: variable 'omit' from source: magic vars 34886 1727204505.28051: variable 'ansible_distribution_major_version' from source: facts 34886 1727204505.28063: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204505.28074: variable 'omit' from source: magic vars 34886 1727204505.28088: variable 'omit' from source: magic vars 34886 1727204505.28120: variable 'omit' from source: magic vars 34886 1727204505.28158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204505.28200: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204505.28217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204505.28237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204505.28248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204505.28276: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204505.28279: variable 'ansible_host' from source: host vars for 
'managed-node3' 34886 1727204505.28289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.28374: Set connection var ansible_timeout to 10 34886 1727204505.28380: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204505.28383: Set connection var ansible_connection to ssh 34886 1727204505.28392: Set connection var ansible_shell_executable to /bin/sh 34886 1727204505.28405: Set connection var ansible_pipelining to False 34886 1727204505.28408: Set connection var ansible_shell_type to sh 34886 1727204505.28431: variable 'ansible_shell_executable' from source: unknown 34886 1727204505.28435: variable 'ansible_connection' from source: unknown 34886 1727204505.28437: variable 'ansible_module_compression' from source: unknown 34886 1727204505.28442: variable 'ansible_shell_type' from source: unknown 34886 1727204505.28445: variable 'ansible_shell_executable' from source: unknown 34886 1727204505.28450: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.28455: variable 'ansible_pipelining' from source: unknown 34886 1727204505.28458: variable 'ansible_timeout' from source: unknown 34886 1727204505.28464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.28585: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204505.28597: variable 'omit' from source: magic vars 34886 1727204505.28603: starting attempt loop 34886 1727204505.28606: running the handler 34886 1727204505.28629: _low_level_execute_command(): starting 34886 1727204505.28636: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204505.29186: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204505.29199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.29206: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204505.29210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.29258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204505.29261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204505.29265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204505.29318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204505.31037: stdout chunk (state=3): >>>/root <<< 34886 1727204505.31151: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204505.31197: stderr chunk (state=3): >>><<< 34886 1727204505.31200: stdout chunk (state=3): >>><<< 34886 1727204505.31221: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204505.31238: _low_level_execute_command(): starting 34886 1727204505.31242: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465 `" && echo ansible-tmp-1727204505.3122337-36153-169550275615465="` echo /root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465 `" ) && sleep 0' 34886 1727204505.31689: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204505.31693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.31697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204505.31707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.31755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204505.31763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204505.31801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204505.33775: stdout chunk (state=3): >>>ansible-tmp-1727204505.3122337-36153-169550275615465=/root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465 <<< 34886 1727204505.33886: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 34886 1727204505.33930: stderr chunk (state=3): >>><<< 34886 1727204505.33934: stdout chunk (state=3): >>><<< 34886 1727204505.33949: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204505.3122337-36153-169550275615465=/root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204505.33979: variable 'ansible_module_compression' from source: unknown 34886 1727204505.34091: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204505.34096: variable 'ansible_facts' from source: unknown 34886 1727204505.34126: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465/AnsiballZ_command.py 34886 1727204505.34237: Sending initial data 34886 1727204505.34241: Sent initial data (156 bytes) 34886 1727204505.34687: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204505.34696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204505.34699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204505.34702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.34761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204505.34766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204505.34798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204505.36388: stderr chunk 
(state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204505.36395: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204505.36416: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204505.36448: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpy6q5t0i7 /root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465/AnsiballZ_command.py <<< 34886 1727204505.36462: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465/AnsiballZ_command.py" <<< 34886 1727204505.36486: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 34886 1727204505.36492: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpy6q5t0i7" to remote "/root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465/AnsiballZ_command.py" <<< 34886 1727204505.37254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204505.37317: stderr chunk (state=3): >>><<< 34886 1727204505.37320: stdout chunk (state=3): >>><<< 34886 1727204505.37341: done transferring module to remote 34886 1727204505.37353: _low_level_execute_command(): starting 34886 1727204505.37358: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465/ /root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465/AnsiballZ_command.py && sleep 0' 34886 1727204505.37807: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204505.37810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.37812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204505.37815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.37870: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204505.37879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204505.37913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204505.39720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204505.39768: stderr chunk (state=3): >>><<< 34886 1727204505.39771: stdout chunk (state=3): >>><<< 34886 1727204505.39786: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204505.39791: _low_level_execute_command(): starting 34886 1727204505.39797: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465/AnsiballZ_command.py && sleep 0' 34886 1727204505.40238: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204505.40241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.40244: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204505.40246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204505.40248: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.40296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204505.40306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204505.40346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 
1727204505.57943: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 15:01:45.574470", "end": "2024-09-24 15:01:45.578212", "delta": "0:00:00.003742", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204505.59660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204505.59724: stderr chunk (state=3): >>><<< 34886 1727204505.59728: stdout chunk (state=3): >>><<< 34886 1727204505.59747: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 15:01:45.574470", "end": "2024-09-24 15:01:45.578212", "delta": "0:00:00.003742", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
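
Editor's note: the trace above shows the command module being staged and executed for the "Get ipv6 routes" task (tests_ipv6.yml:69): AnsiballZ_command.py is uploaded over the multiplexed SSH connection, run with /usr/bin/python3.12, and returns the output of "ip -6 route" as JSON. A minimal sketch of what that task, together with the preceding "Assert ipv6 addresses are correctly set" task, plausibly looks like in the playbook is given below. This is a hedged reconstruction, not the actual file: the assert conditions are taken from the evaluated conditionals in the trace, ip_addr is a variable registered earlier in the play (it resolves "from source: set_fact"), the register name ipv6_route is inferred from the later "from source: set_fact" lookups, and changed_when: false is an assumption based on the final task result reporting "changed": false even though the module itself returned "changed": true.

# Hedged reconstruction -- not copied from tests_ipv6.yml
- name: Assert ipv6 addresses are correctly set
  assert:
    that:
      - "'inet6 2001:db8::2/32' in ip_addr.stdout"
      - "'inet6 2001:db8::3/32' in ip_addr.stdout"
      - "'inet6 2001:db8::4/32' in ip_addr.stdout"

- name: Get ipv6 routes
  command: ip -6 route
  register: ipv6_route
  changed_when: false
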
34886 1727204505.59791: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204505.59800: _low_level_execute_command(): starting 34886 1727204505.59806: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204505.3122337-36153-169550275615465/ > /dev/null 2>&1 && sleep 0' 34886 1727204505.60273: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204505.60318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204505.60321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204505.60324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204505.60333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204505.60335: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.60378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204505.60382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204505.60426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204505.62333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204505.62382: stderr chunk (state=3): >>><<< 34886 1727204505.62386: stdout chunk (state=3): >>><<< 34886 1727204505.62403: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204505.62412: handler run complete 34886 1727204505.62438: Evaluated conditional (False): False 34886 1727204505.62449: attempt loop complete, returning result 34886 1727204505.62452: _execute() done 34886 1727204505.62455: dumping result to json 34886 1727204505.62465: done dumping result, returning 34886 1727204505.62473: done running TaskExecutor() for managed-node3/TASK: Get ipv6 routes [12b410aa-8751-04b9-2e74-000000000061] 34886 1727204505.62482: sending task result for task 12b410aa-8751-04b9-2e74-000000000061 34886 1727204505.62587: done sending task result for task 12b410aa-8751-04b9-2e74-000000000061 34886 1727204505.62590: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003742", "end": "2024-09-24 15:01:45.578212", "rc": 0, "start": "2024-09-24 15:01:45.574470" } STDOUT: 2001:db8::/32 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium default via 2001:db8::1 dev veth0 proto static metric 101 pref medium 34886 1727204505.62685: no more pending results, returning what we have 34886 1727204505.62691: results queue empty 34886 1727204505.62693: checking for any_errors_fatal 34886 1727204505.62708: done checking for any_errors_fatal 34886 1727204505.62709: checking for max_fail_percentage 34886 1727204505.62711: done checking for max_fail_percentage 34886 1727204505.62712: checking to see if all hosts have failed and the running result is not ok 34886 1727204505.62713: done checking to see if all hosts have failed 34886 1727204505.62714: getting the remaining hosts for this loop 34886 1727204505.62715: done getting the remaining hosts for this loop 34886 1727204505.62720: getting the next task for host managed-node3 34886 1727204505.62726: done getting next task for host managed-node3 34886 1727204505.62729: ^ task is: TASK: Show ipv6_route 34886 1727204505.62733: ^ state is: HOST STATE: block=3, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204505.62737: getting variables 34886 1727204505.62739: in VariableManager get_vars() 34886 1727204505.62788: Calling all_inventory to load vars for managed-node3 34886 1727204505.62890: Calling groups_inventory to load vars for managed-node3 34886 1727204505.62894: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204505.62908: Calling all_plugins_play to load vars for managed-node3 34886 1727204505.62912: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204505.62919: Calling groups_plugins_play to load vars for managed-node3 34886 1727204505.64359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204505.66785: done with get_vars() 34886 1727204505.66827: done getting variables 34886 1727204505.66900: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ipv6_route] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:73 Tuesday 24 September 2024 15:01:45 -0400 (0:00:00.399) 0:00:23.837 ***** 34886 1727204505.66932: entering _queue_task() for managed-node3/debug 34886 1727204505.67522: worker is 1 (out of 1 available) 34886 1727204505.67532: exiting _queue_task() for managed-node3/debug 34886 1727204505.67543: done queuing things up, now waiting for results queue to drain 34886 1727204505.67545: waiting for pending results... 34886 1727204505.67675: running TaskExecutor() for managed-node3/TASK: Show ipv6_route 34886 1727204505.67772: in run() - task 12b410aa-8751-04b9-2e74-000000000062 34886 1727204505.67775: variable 'ansible_search_path' from source: unknown 34886 1727204505.67817: calling self._execute() 34886 1727204505.67988: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.67994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.67997: variable 'omit' from source: magic vars 34886 1727204505.68427: variable 'ansible_distribution_major_version' from source: facts 34886 1727204505.68446: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204505.68458: variable 'omit' from source: magic vars 34886 1727204505.68485: variable 'omit' from source: magic vars 34886 1727204505.68542: variable 'omit' from source: magic vars 34886 1727204505.68597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204505.68650: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204505.68749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204505.68754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204505.68757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204505.68760: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204505.68770: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 
1727204505.68779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.68912: Set connection var ansible_timeout to 10 34886 1727204505.68927: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204505.68934: Set connection var ansible_connection to ssh 34886 1727204505.68946: Set connection var ansible_shell_executable to /bin/sh 34886 1727204505.68966: Set connection var ansible_pipelining to False 34886 1727204505.68976: Set connection var ansible_shell_type to sh 34886 1727204505.69014: variable 'ansible_shell_executable' from source: unknown 34886 1727204505.69025: variable 'ansible_connection' from source: unknown 34886 1727204505.69035: variable 'ansible_module_compression' from source: unknown 34886 1727204505.69044: variable 'ansible_shell_type' from source: unknown 34886 1727204505.69076: variable 'ansible_shell_executable' from source: unknown 34886 1727204505.69079: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.69081: variable 'ansible_pipelining' from source: unknown 34886 1727204505.69083: variable 'ansible_timeout' from source: unknown 34886 1727204505.69085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.69250: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204505.69294: variable 'omit' from source: magic vars 34886 1727204505.69297: starting attempt loop 34886 1727204505.69301: running the handler 34886 1727204505.69450: variable 'ipv6_route' from source: set_fact 34886 1727204505.69476: handler run complete 34886 1727204505.69514: attempt loop complete, returning result 34886 1727204505.69517: _execute() done 34886 1727204505.69594: dumping result to json 34886 1727204505.69598: done dumping result, returning 34886 1727204505.69601: done running TaskExecutor() for managed-node3/TASK: Show ipv6_route [12b410aa-8751-04b9-2e74-000000000062] 34886 1727204505.69603: sending task result for task 12b410aa-8751-04b9-2e74-000000000062 34886 1727204505.69895: done sending task result for task 12b410aa-8751-04b9-2e74-000000000062 34886 1727204505.69899: WORKER PROCESS EXITING ok: [managed-node3] => { "ipv6_route.stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium" } 34886 1727204505.69946: no more pending results, returning what we have 34886 1727204505.69950: results queue empty 34886 1727204505.69952: checking for any_errors_fatal 34886 1727204505.69960: done checking for any_errors_fatal 34886 1727204505.69962: checking for max_fail_percentage 34886 1727204505.69964: done checking for max_fail_percentage 34886 1727204505.69965: checking to see if all hosts have failed and the running result is not ok 34886 1727204505.69966: done checking to see if all hosts have failed 34886 1727204505.69967: getting the remaining hosts for this loop 34886 1727204505.69969: done getting the remaining hosts for this loop 34886 1727204505.69973: getting the next task for host managed-node3 34886 1727204505.69979: done getting next task for host managed-node3 34886 1727204505.69982: ^ task is: 
TASK: Assert default ipv6 route is set 34886 1727204505.69985: ^ state is: HOST STATE: block=3, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34886 1727204505.69988: getting variables 34886 1727204505.69993: in VariableManager get_vars() 34886 1727204505.70038: Calling all_inventory to load vars for managed-node3 34886 1727204505.70042: Calling groups_inventory to load vars for managed-node3 34886 1727204505.70045: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204505.70058: Calling all_plugins_play to load vars for managed-node3 34886 1727204505.70062: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204505.70065: Calling groups_plugins_play to load vars for managed-node3 34886 1727204505.71612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204505.73691: done with get_vars() 34886 1727204505.73725: done getting variables 34886 1727204505.73799: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is set] **************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:76 Tuesday 24 September 2024 15:01:45 -0400 (0:00:00.068) 0:00:23.906 ***** 34886 1727204505.73834: entering _queue_task() for managed-node3/assert 34886 1727204505.74424: worker is 1 (out of 1 available) 34886 1727204505.74434: exiting _queue_task() for managed-node3/assert 34886 1727204505.74445: done queuing things up, now waiting for results queue to drain 34886 1727204505.74447: waiting for pending results... 
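
Editor's note: the debug task whose result appears above and the assert task being queued here both consume the registered ipv6_route output. A plausible sketch of the two tasks (tests_ipv6.yml:73 and :76) follows; it is a reconstruction, not the actual file. The trace shows __test_str resolving "from source: task vars" and pulling in the interface play variable, so it is modelled here as a vars: entry on the assert; its exact value is an assumption, chosen to match the default-route line that the conditional (__test_str in ipv6_route.stdout) later evaluates to True against.

# Hedged reconstruction -- not copied from tests_ipv6.yml
- name: Show ipv6_route
  debug:
    var: ipv6_route.stdout

- name: Assert default ipv6 route is set
  assert:
    that:
      - __test_str in ipv6_route.stdout
  vars:
    # interface is a play variable; in this run it expands to veth0
    __test_str: "default via 2001:db8::1 dev {{ interface }} proto static metric 101"
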
34886 1727204505.74525: running TaskExecutor() for managed-node3/TASK: Assert default ipv6 route is set 34886 1727204505.74640: in run() - task 12b410aa-8751-04b9-2e74-000000000063 34886 1727204505.74668: variable 'ansible_search_path' from source: unknown 34886 1727204505.74718: calling self._execute() 34886 1727204505.74834: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.74848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.74868: variable 'omit' from source: magic vars 34886 1727204505.75313: variable 'ansible_distribution_major_version' from source: facts 34886 1727204505.75338: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204505.75350: variable 'omit' from source: magic vars 34886 1727204505.75375: variable 'omit' from source: magic vars 34886 1727204505.75425: variable 'omit' from source: magic vars 34886 1727204505.75479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204505.75528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204505.75562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204505.75588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204505.75608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204505.75647: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204505.75662: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.75671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.75875: Set connection var ansible_timeout to 10 34886 1727204505.75879: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204505.75882: Set connection var ansible_connection to ssh 34886 1727204505.75884: Set connection var ansible_shell_executable to /bin/sh 34886 1727204505.75886: Set connection var ansible_pipelining to False 34886 1727204505.75888: Set connection var ansible_shell_type to sh 34886 1727204505.75892: variable 'ansible_shell_executable' from source: unknown 34886 1727204505.75894: variable 'ansible_connection' from source: unknown 34886 1727204505.75897: variable 'ansible_module_compression' from source: unknown 34886 1727204505.75904: variable 'ansible_shell_type' from source: unknown 34886 1727204505.75911: variable 'ansible_shell_executable' from source: unknown 34886 1727204505.75919: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.75928: variable 'ansible_pipelining' from source: unknown 34886 1727204505.75935: variable 'ansible_timeout' from source: unknown 34886 1727204505.75944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.76122: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204505.76142: variable 'omit' from source: magic vars 34886 1727204505.76153: starting attempt loop 34886 1727204505.76160: running 
the handler 34886 1727204505.76357: variable '__test_str' from source: task vars 34886 1727204505.76495: variable 'interface' from source: play vars 34886 1727204505.76498: variable 'ipv6_route' from source: set_fact 34886 1727204505.76501: Evaluated conditional (__test_str in ipv6_route.stdout): True 34886 1727204505.76503: handler run complete 34886 1727204505.76520: attempt loop complete, returning result 34886 1727204505.76532: _execute() done 34886 1727204505.76539: dumping result to json 34886 1727204505.76547: done dumping result, returning 34886 1727204505.76558: done running TaskExecutor() for managed-node3/TASK: Assert default ipv6 route is set [12b410aa-8751-04b9-2e74-000000000063] 34886 1727204505.76569: sending task result for task 12b410aa-8751-04b9-2e74-000000000063 34886 1727204505.76841: done sending task result for task 12b410aa-8751-04b9-2e74-000000000063 34886 1727204505.76844: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 34886 1727204505.76898: no more pending results, returning what we have 34886 1727204505.76902: results queue empty 34886 1727204505.76904: checking for any_errors_fatal 34886 1727204505.76909: done checking for any_errors_fatal 34886 1727204505.76910: checking for max_fail_percentage 34886 1727204505.76912: done checking for max_fail_percentage 34886 1727204505.76914: checking to see if all hosts have failed and the running result is not ok 34886 1727204505.76915: done checking to see if all hosts have failed 34886 1727204505.76916: getting the remaining hosts for this loop 34886 1727204505.76917: done getting the remaining hosts for this loop 34886 1727204505.76922: getting the next task for host managed-node3 34886 1727204505.76927: done getting next task for host managed-node3 34886 1727204505.76931: ^ task is: TASK: Ensure ping6 command is present 34886 1727204505.76934: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204505.76937: getting variables 34886 1727204505.76939: in VariableManager get_vars() 34886 1727204505.76985: Calling all_inventory to load vars for managed-node3 34886 1727204505.76990: Calling groups_inventory to load vars for managed-node3 34886 1727204505.76993: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204505.77006: Calling all_plugins_play to load vars for managed-node3 34886 1727204505.77010: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204505.77014: Calling groups_plugins_play to load vars for managed-node3 34886 1727204505.79316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204505.82241: done with get_vars() 34886 1727204505.82276: done getting variables 34886 1727204505.82348: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure ping6 command is present] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81 Tuesday 24 September 2024 15:01:45 -0400 (0:00:00.085) 0:00:23.991 ***** 34886 1727204505.82382: entering _queue_task() for managed-node3/package 34886 1727204505.82737: worker is 1 (out of 1 available) 34886 1727204505.82752: exiting _queue_task() for managed-node3/package 34886 1727204505.82766: done queuing things up, now waiting for results queue to drain 34886 1727204505.82768: waiting for pending results... 34886 1727204505.83066: running TaskExecutor() for managed-node3/TASK: Ensure ping6 command is present 34886 1727204505.83216: in run() - task 12b410aa-8751-04b9-2e74-000000000064 34886 1727204505.83220: variable 'ansible_search_path' from source: unknown 34886 1727204505.83258: calling self._execute() 34886 1727204505.83374: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.83594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.83598: variable 'omit' from source: magic vars 34886 1727204505.83842: variable 'ansible_distribution_major_version' from source: facts 34886 1727204505.83860: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204505.83872: variable 'omit' from source: magic vars 34886 1727204505.83900: variable 'omit' from source: magic vars 34886 1727204505.84154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204505.86977: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204505.87061: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204505.87114: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204505.87173: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204505.87215: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204505.87334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204505.87372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204505.87416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204505.87473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204505.87496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204505.87628: variable '__network_is_ostree' from source: set_fact 34886 1727204505.87639: variable 'omit' from source: magic vars 34886 1727204505.87677: variable 'omit' from source: magic vars 34886 1727204505.87714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204505.87757: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204505.87782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204505.87810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204505.87828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204505.87871: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204505.87880: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.87950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.88036: Set connection var ansible_timeout to 10 34886 1727204505.88049: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204505.88061: Set connection var ansible_connection to ssh 34886 1727204505.88073: Set connection var ansible_shell_executable to /bin/sh 34886 1727204505.88088: Set connection var ansible_pipelining to False 34886 1727204505.88097: Set connection var ansible_shell_type to sh 34886 1727204505.88132: variable 'ansible_shell_executable' from source: unknown 34886 1727204505.88141: variable 'ansible_connection' from source: unknown 34886 1727204505.88149: variable 'ansible_module_compression' from source: unknown 34886 1727204505.88156: variable 'ansible_shell_type' from source: unknown 34886 1727204505.88171: variable 'ansible_shell_executable' from source: unknown 34886 1727204505.88181: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204505.88275: variable 'ansible_pipelining' from source: unknown 34886 1727204505.88278: variable 'ansible_timeout' from source: unknown 34886 1727204505.88281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204505.88346: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204505.88366: variable 'omit' from source: magic vars 34886 1727204505.88379: starting attempt loop 34886 1727204505.88494: running the handler 34886 1727204505.88500: variable 'ansible_facts' from source: unknown 34886 1727204505.88503: variable 'ansible_facts' from source: unknown 34886 1727204505.88505: _low_level_execute_command(): starting 34886 1727204505.88508: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204505.89207: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204505.89268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.89340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204505.89357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204505.89384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204505.89462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204505.91257: stdout chunk (state=3): >>>/root <<< 34886 1727204505.91447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204505.91451: stdout chunk (state=3): >>><<< 34886 1727204505.91453: stderr chunk (state=3): >>><<< 34886 1727204505.91473: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 
1727204505.91588: _low_level_execute_command(): starting 34886 1727204505.91596: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013 `" && echo ansible-tmp-1727204505.914875-36164-257455646913013="` echo /root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013 `" ) && sleep 0' 34886 1727204505.92202: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204505.92205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204505.92208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204505.92211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204505.92213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204505.92312: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.92329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204505.92348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204505.92352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204505.92426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204505.94435: stdout chunk (state=3): >>>ansible-tmp-1727204505.914875-36164-257455646913013=/root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013 <<< 34886 1727204505.94641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204505.94645: stdout chunk (state=3): >>><<< 34886 1727204505.94647: stderr chunk (state=3): >>><<< 34886 1727204505.94665: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204505.914875-36164-257455646913013=/root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204505.94723: variable 'ansible_module_compression' from source: unknown 34886 1727204505.94825: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 34886 1727204505.94860: variable 'ansible_facts' from source: unknown 34886 1727204505.94985: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013/AnsiballZ_dnf.py 34886 1727204505.95279: Sending initial data 34886 1727204505.95283: Sent initial data (151 bytes) 34886 1727204505.95907: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204505.95964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204505.95983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204505.96025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204505.96075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204505.97725: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204505.97792: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204505.97826: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpyjagqp97 /root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013/AnsiballZ_dnf.py <<< 34886 1727204505.97829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013/AnsiballZ_dnf.py" <<< 34886 1727204505.97860: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpyjagqp97" to remote "/root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013/AnsiballZ_dnf.py" <<< 34886 1727204505.99428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204505.99432: stderr chunk (state=3): >>><<< 34886 1727204505.99436: stdout chunk (state=3): >>><<< 34886 1727204505.99438: done transferring module to remote 34886 1727204505.99440: _low_level_execute_command(): starting 34886 1727204505.99442: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013/ /root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013/AnsiballZ_dnf.py && sleep 0' 34886 1727204506.00171: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204506.00178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204506.00181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204506.00184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204506.00186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204506.00261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204506.00266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204506.00336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204506.02295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204506.02299: stdout chunk (state=3): >>><<< 34886 1727204506.02301: stderr chunk (state=3): >>><<< 34886 1727204506.02304: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204506.02307: _low_level_execute_command(): starting 34886 1727204506.02309: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013/AnsiballZ_dnf.py && sleep 0' 34886 1727204506.02883: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204506.02894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204506.02967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 34886 1727204506.02992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204506.03005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204506.03073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204507.49757: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 34886 1727204507.54884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204507.54913: stderr chunk (state=3): >>><<< 34886 1727204507.54919: stdout chunk (state=3): >>><<< 34886 1727204507.54941: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
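The JSON payload unpacked above is the dnf module's return value for the package check behind "Ensure ping6 command is present": rc=0 with msg "Nothing to do", meaning the iputils package (which provides ping6) was already installed, so the task comes back ok rather than changed. The invocation.module_args block records exactly what was requested. The task's own YAML is not part of this log, but a task producing those arguments would look roughly like the sketch below; the module FQCN and layout are assumptions, and only the name/state values are taken from the recorded arguments.

# Reconstruction from the module_args captured in the log; the real task
# definition is not shown in this excerpt and may be worded differently.
- name: Ensure ping6 command is present
  ansible.builtin.dnf:        # the log shows this resolving through ansible.legacy.dnf to the dnf module
    name: iputils             # module_args: name=["iputils"]
    state: present            # module_args: state="present"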
34886 1727204507.54984: done with _execute_module (ansible.legacy.dnf, {'name': 'iputils', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204507.54996: _low_level_execute_command(): starting 34886 1727204507.55002: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204505.914875-36164-257455646913013/ > /dev/null 2>&1 && sleep 0' 34886 1727204507.55466: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204507.55512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204507.55516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204507.55523: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204507.55525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204507.55527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204507.55574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204507.55581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204507.55583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204507.55624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204507.57539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204507.57590: stderr chunk (state=3): >>><<< 34886 1727204507.57594: stdout chunk (state=3): >>><<< 34886 1727204507.57609: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204507.57616: handler run complete 34886 1727204507.57659: attempt loop complete, returning result 34886 1727204507.57662: _execute() done 34886 1727204507.57665: dumping result to json 34886 1727204507.57672: done dumping result, returning 34886 1727204507.57682: done running TaskExecutor() for managed-node3/TASK: Ensure ping6 command is present [12b410aa-8751-04b9-2e74-000000000064] 34886 1727204507.57690: sending task result for task 12b410aa-8751-04b9-2e74-000000000064 34886 1727204507.57795: done sending task result for task 12b410aa-8751-04b9-2e74-000000000064 34886 1727204507.57799: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 34886 1727204507.57894: no more pending results, returning what we have 34886 1727204507.57898: results queue empty 34886 1727204507.57900: checking for any_errors_fatal 34886 1727204507.57914: done checking for any_errors_fatal 34886 1727204507.57915: checking for max_fail_percentage 34886 1727204507.57917: done checking for max_fail_percentage 34886 1727204507.57918: checking to see if all hosts have failed and the running result is not ok 34886 1727204507.57919: done checking to see if all hosts have failed 34886 1727204507.57920: getting the remaining hosts for this loop 34886 1727204507.57922: done getting the remaining hosts for this loop 34886 1727204507.57927: getting the next task for host managed-node3 34886 1727204507.57932: done getting next task for host managed-node3 34886 1727204507.57936: ^ task is: TASK: Test gateway can be pinged 34886 1727204507.57938: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204507.57942: getting variables 34886 1727204507.57944: in VariableManager get_vars() 34886 1727204507.57988: Calling all_inventory to load vars for managed-node3 34886 1727204507.57995: Calling groups_inventory to load vars for managed-node3 34886 1727204507.57997: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204507.58009: Calling all_plugins_play to load vars for managed-node3 34886 1727204507.58012: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204507.58024: Calling groups_plugins_play to load vars for managed-node3 34886 1727204507.59432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204507.60982: done with get_vars() 34886 1727204507.61008: done getting variables 34886 1727204507.61059: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Test gateway can be pinged] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:86 Tuesday 24 September 2024 15:01:47 -0400 (0:00:01.786) 0:00:25.778 ***** 34886 1727204507.61083: entering _queue_task() for managed-node3/command 34886 1727204507.61347: worker is 1 (out of 1 available) 34886 1727204507.61362: exiting _queue_task() for managed-node3/command 34886 1727204507.61377: done queuing things up, now waiting for results queue to drain 34886 1727204507.61379: waiting for pending results... 34886 1727204507.61576: running TaskExecutor() for managed-node3/TASK: Test gateway can be pinged 34886 1727204507.61652: in run() - task 12b410aa-8751-04b9-2e74-000000000065 34886 1727204507.61666: variable 'ansible_search_path' from source: unknown 34886 1727204507.61701: calling self._execute() 34886 1727204507.61788: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204507.61797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204507.61808: variable 'omit' from source: magic vars 34886 1727204507.62149: variable 'ansible_distribution_major_version' from source: facts 34886 1727204507.62162: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204507.62166: variable 'omit' from source: magic vars 34886 1727204507.62184: variable 'omit' from source: magic vars 34886 1727204507.62218: variable 'omit' from source: magic vars 34886 1727204507.62257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204507.62293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204507.62313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204507.62331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204507.62342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204507.62373: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204507.62376: variable 'ansible_host' from source: host vars for 
'managed-node3' 34886 1727204507.62379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204507.62469: Set connection var ansible_timeout to 10 34886 1727204507.62475: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204507.62478: Set connection var ansible_connection to ssh 34886 1727204507.62485: Set connection var ansible_shell_executable to /bin/sh 34886 1727204507.62496: Set connection var ansible_pipelining to False 34886 1727204507.62501: Set connection var ansible_shell_type to sh 34886 1727204507.62525: variable 'ansible_shell_executable' from source: unknown 34886 1727204507.62529: variable 'ansible_connection' from source: unknown 34886 1727204507.62532: variable 'ansible_module_compression' from source: unknown 34886 1727204507.62534: variable 'ansible_shell_type' from source: unknown 34886 1727204507.62539: variable 'ansible_shell_executable' from source: unknown 34886 1727204507.62543: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204507.62548: variable 'ansible_pipelining' from source: unknown 34886 1727204507.62552: variable 'ansible_timeout' from source: unknown 34886 1727204507.62557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204507.62681: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204507.62693: variable 'omit' from source: magic vars 34886 1727204507.62699: starting attempt loop 34886 1727204507.62702: running the handler 34886 1727204507.62722: _low_level_execute_command(): starting 34886 1727204507.62733: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204507.63284: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204507.63287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204507.63301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204507.63353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204507.63361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204507.63363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204507.63408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204507.65100: stdout chunk (state=3): >>>/root <<< 34886 1727204507.65208: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 34886 1727204507.65263: stderr chunk (state=3): >>><<< 34886 1727204507.65267: stdout chunk (state=3): >>><<< 34886 1727204507.65286: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204507.65300: _low_level_execute_command(): starting 34886 1727204507.65307: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767 `" && echo ansible-tmp-1727204507.6528687-36201-228214828817767="` echo /root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767 `" ) && sleep 0' 34886 1727204507.65763: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204507.65767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204507.65770: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204507.65782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204507.65830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204507.65835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204507.65878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204507.67860: stdout chunk (state=3): >>>ansible-tmp-1727204507.6528687-36201-228214828817767=/root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767 <<< 34886 1727204507.67979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 
1727204507.68030: stderr chunk (state=3): >>><<< 34886 1727204507.68033: stdout chunk (state=3): >>><<< 34886 1727204507.68054: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204507.6528687-36201-228214828817767=/root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204507.68082: variable 'ansible_module_compression' from source: unknown 34886 1727204507.68128: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204507.68161: variable 'ansible_facts' from source: unknown 34886 1727204507.68230: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767/AnsiballZ_command.py 34886 1727204507.68346: Sending initial data 34886 1727204507.68350: Sent initial data (156 bytes) 34886 1727204507.68818: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204507.68823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204507.68826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204507.68828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204507.68883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204507.68890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204507.68927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204507.70524: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 34886 1727204507.70527: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204507.70558: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204507.70592: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp833zendv /root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767/AnsiballZ_command.py <<< 34886 1727204507.70596: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767/AnsiballZ_command.py" <<< 34886 1727204507.70620: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp833zendv" to remote "/root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767/AnsiballZ_command.py" <<< 34886 1727204507.71377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204507.71449: stderr chunk (state=3): >>><<< 34886 1727204507.71452: stdout chunk (state=3): >>><<< 34886 1727204507.71473: done transferring module to remote 34886 1727204507.71483: _low_level_execute_command(): starting 34886 1727204507.71490: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767/ /root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767/AnsiballZ_command.py && sleep 0' 34886 1727204507.71956: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204507.71960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204507.71963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204507.71965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204507.71968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204507.72021: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 34886 1727204507.72029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204507.72067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204507.73882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204507.73935: stderr chunk (state=3): >>><<< 34886 1727204507.73939: stdout chunk (state=3): >>><<< 34886 1727204507.73955: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204507.73958: _low_level_execute_command(): starting 34886 1727204507.73964: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767/AnsiballZ_command.py && sleep 0' 34886 1727204507.74429: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204507.74433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204507.74435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204507.74438: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204507.74440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204507.74496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204507.74499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204507.74537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204508.00582: stdout chunk (state=3): 
>>> {"changed": true, "stdout": "PING 2001:db8::1(2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.055 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.055/0.055/0.055/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-24 15:01:47.914889", "end": "2024-09-24 15:01:48.004537", "delta": "0:00:00.089648", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204508.02243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204508.02304: stderr chunk (state=3): >>><<< 34886 1727204508.02308: stdout chunk (state=3): >>><<< 34886 1727204508.02331: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "PING 2001:db8::1(2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.055 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.055/0.055/0.055/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-24 15:01:47.914889", "end": "2024-09-24 15:01:48.004537", "delta": "0:00:00.089648", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
34886 1727204508.02373: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ping6 -c1 2001:db8::1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204508.02384: _low_level_execute_command(): starting 34886 1727204508.02392: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204507.6528687-36201-228214828817767/ > /dev/null 2>&1 && sleep 0' 34886 1727204508.02882: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204508.02885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204508.02891: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204508.02899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204508.02949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204508.02953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204508.02996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204508.04885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204508.04935: stderr chunk (state=3): >>><<< 34886 1727204508.04939: stdout chunk (state=3): >>><<< 34886 1727204508.04959: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204508.04966: handler run complete 34886 1727204508.04988: Evaluated conditional (False): False 34886 1727204508.05001: attempt loop complete, returning result 34886 1727204508.05004: _execute() done 34886 1727204508.05006: dumping result to json 34886 1727204508.05013: done dumping result, returning 34886 1727204508.05024: done running TaskExecutor() for managed-node3/TASK: Test gateway can be pinged [12b410aa-8751-04b9-2e74-000000000065] 34886 1727204508.05029: sending task result for task 12b410aa-8751-04b9-2e74-000000000065 34886 1727204508.05142: done sending task result for task 12b410aa-8751-04b9-2e74-000000000065 34886 1727204508.05145: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ping6", "-c1", "2001:db8::1" ], "delta": "0:00:00.089648", "end": "2024-09-24 15:01:48.004537", "rc": 0, "start": "2024-09-24 15:01:47.914889" } STDOUT: PING 2001:db8::1(2001:db8::1) 56 data bytes 64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.055 ms --- 2001:db8::1 ping statistics --- 1 packets transmitted, 1 received, 0% packet loss, time 0ms rtt min/avg/max/mdev = 0.055/0.055/0.055/0.000 ms 34886 1727204508.05242: no more pending results, returning what we have 34886 1727204508.05246: results queue empty 34886 1727204508.05247: checking for any_errors_fatal 34886 1727204508.05257: done checking for any_errors_fatal 34886 1727204508.05258: checking for max_fail_percentage 34886 1727204508.05260: done checking for max_fail_percentage 34886 1727204508.05261: checking to see if all hosts have failed and the running result is not ok 34886 1727204508.05262: done checking to see if all hosts have failed 34886 1727204508.05263: getting the remaining hosts for this loop 34886 1727204508.05264: done getting the remaining hosts for this loop 34886 1727204508.05269: getting the next task for host managed-node3 34886 1727204508.05276: done getting next task for host managed-node3 34886 1727204508.05279: ^ task is: TASK: TEARDOWN: remove profiles. 34886 1727204508.05282: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204508.05285: getting variables 34886 1727204508.05286: in VariableManager get_vars() 34886 1727204508.05332: Calling all_inventory to load vars for managed-node3 34886 1727204508.05335: Calling groups_inventory to load vars for managed-node3 34886 1727204508.05338: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204508.05349: Calling all_plugins_play to load vars for managed-node3 34886 1727204508.05352: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204508.05356: Calling groups_plugins_play to load vars for managed-node3 34886 1727204508.06610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204508.08275: done with get_vars() 34886 1727204508.08300: done getting variables 34886 1727204508.08357: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:92 Tuesday 24 September 2024 15:01:48 -0400 (0:00:00.472) 0:00:26.251 ***** 34886 1727204508.08380: entering _queue_task() for managed-node3/debug 34886 1727204508.08624: worker is 1 (out of 1 available) 34886 1727204508.08638: exiting _queue_task() for managed-node3/debug 34886 1727204508.08651: done queuing things up, now waiting for results queue to drain 34886 1727204508.08653: waiting for pending results... 34886 1727204508.08841: running TaskExecutor() for managed-node3/TASK: TEARDOWN: remove profiles. 
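Worth noting in the HOST STATE dumps above: while the ping test ran the iterator reported run_state=1, and it now reports run_state=3 with the always counter advanced, which appears to correspond to the play having finished the block's task list and moved into its always section; that is why "TEARDOWN: remove profiles." (tests_ipv6.yml:92) runs regardless of how the earlier checks ended. In playbook terms this is the usual block/always cleanup pattern; the generic sketch below is illustrative only, not the literal contents of tests_ipv6.yml.

# Generic block/always layout matching the host-state transition in the log;
# the task bodies are reconstructions, not quotations from the playbook.
- block:
    - name: Test gateway can be pinged
      ansible.builtin.command: ping6 -c1 2001:db8::1
      changed_when: false
  always:
    - name: "TEARDOWN: remove profiles."
      ansible.builtin.debug:
        msg: "##################################################"   # separator banner; exact text approximated from the result shown below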
34886 1727204508.08904: in run() - task 12b410aa-8751-04b9-2e74-000000000066 34886 1727204508.08922: variable 'ansible_search_path' from source: unknown 34886 1727204508.08957: calling self._execute() 34886 1727204508.09048: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204508.09055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204508.09066: variable 'omit' from source: magic vars 34886 1727204508.09393: variable 'ansible_distribution_major_version' from source: facts 34886 1727204508.09404: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204508.09410: variable 'omit' from source: magic vars 34886 1727204508.09432: variable 'omit' from source: magic vars 34886 1727204508.09467: variable 'omit' from source: magic vars 34886 1727204508.09504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204508.09538: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204508.09556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204508.09574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204508.09585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204508.09615: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204508.09621: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204508.09624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204508.09711: Set connection var ansible_timeout to 10 34886 1727204508.09717: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204508.09722: Set connection var ansible_connection to ssh 34886 1727204508.09728: Set connection var ansible_shell_executable to /bin/sh 34886 1727204508.09737: Set connection var ansible_pipelining to False 34886 1727204508.09739: Set connection var ansible_shell_type to sh 34886 1727204508.09763: variable 'ansible_shell_executable' from source: unknown 34886 1727204508.09767: variable 'ansible_connection' from source: unknown 34886 1727204508.09770: variable 'ansible_module_compression' from source: unknown 34886 1727204508.09773: variable 'ansible_shell_type' from source: unknown 34886 1727204508.09778: variable 'ansible_shell_executable' from source: unknown 34886 1727204508.09783: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204508.09789: variable 'ansible_pipelining' from source: unknown 34886 1727204508.09794: variable 'ansible_timeout' from source: unknown 34886 1727204508.09800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204508.09923: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204508.09933: variable 'omit' from source: magic vars 34886 1727204508.09939: starting attempt loop 34886 1727204508.09942: running the handler 34886 1727204508.09987: handler run complete 34886 1727204508.10007: attempt loop complete, 
returning result 34886 1727204508.10011: _execute() done 34886 1727204508.10014: dumping result to json 34886 1727204508.10016: done dumping result, returning 34886 1727204508.10025: done running TaskExecutor() for managed-node3/TASK: TEARDOWN: remove profiles. [12b410aa-8751-04b9-2e74-000000000066] 34886 1727204508.10031: sending task result for task 12b410aa-8751-04b9-2e74-000000000066 34886 1727204508.10126: done sending task result for task 12b410aa-8751-04b9-2e74-000000000066 34886 1727204508.10129: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: ################################################## 34886 1727204508.10181: no more pending results, returning what we have 34886 1727204508.10185: results queue empty 34886 1727204508.10187: checking for any_errors_fatal 34886 1727204508.10197: done checking for any_errors_fatal 34886 1727204508.10198: checking for max_fail_percentage 34886 1727204508.10200: done checking for max_fail_percentage 34886 1727204508.10201: checking to see if all hosts have failed and the running result is not ok 34886 1727204508.10202: done checking to see if all hosts have failed 34886 1727204508.10203: getting the remaining hosts for this loop 34886 1727204508.10205: done getting the remaining hosts for this loop 34886 1727204508.10210: getting the next task for host managed-node3 34886 1727204508.10217: done getting next task for host managed-node3 34886 1727204508.10225: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34886 1727204508.10228: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204508.10247: getting variables 34886 1727204508.10249: in VariableManager get_vars() 34886 1727204508.10287: Calling all_inventory to load vars for managed-node3 34886 1727204508.10298: Calling groups_inventory to load vars for managed-node3 34886 1727204508.10301: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204508.10311: Calling all_plugins_play to load vars for managed-node3 34886 1727204508.10315: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204508.10318: Calling groups_plugins_play to load vars for managed-node3 34886 1727204508.11514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204508.13095: done with get_vars() 34886 1727204508.13123: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:01:48 -0400 (0:00:00.048) 0:00:26.299 ***** 34886 1727204508.13207: entering _queue_task() for managed-node3/include_tasks 34886 1727204508.13465: worker is 1 (out of 1 available) 34886 1727204508.13481: exiting _queue_task() for managed-node3/include_tasks 34886 1727204508.13497: done queuing things up, now waiting for results queue to drain 34886 1727204508.13500: waiting for pending results... 
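The row of '#' characters above is the output of the "TEARDOWN: remove profiles." debug task; immediately afterwards the play enters the network role, starting with "Ensure ansible_facts used by role" at roles/network/tasks/main.yml:4. The log queues it via "entering _queue_task() for managed-node3/include_tasks", so this entry point is a dynamic include, and the lines that follow show it loading set_facts.yml. A minimal sketch of such an entry point is given below; only the task name, the file path, and the included file name come from the log, the rest is assumed.

# Sketch of an include_tasks entry point like the one at roles/network/tasks/main.yml:4;
# the real file's contents are not reproduced in this log.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml

Because include_tasks is dynamic, the conditional attached to it (here ansible_distribution_major_version != '6', visible in the lines that follow) is evaluated at run time before the included file is loaded, which matches the order of the log entries below.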
34886 1727204508.13688: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34886 1727204508.13802: in run() - task 12b410aa-8751-04b9-2e74-00000000006e 34886 1727204508.13816: variable 'ansible_search_path' from source: unknown 34886 1727204508.13822: variable 'ansible_search_path' from source: unknown 34886 1727204508.13855: calling self._execute() 34886 1727204508.13940: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204508.13945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204508.13959: variable 'omit' from source: magic vars 34886 1727204508.14279: variable 'ansible_distribution_major_version' from source: facts 34886 1727204508.14293: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204508.14300: _execute() done 34886 1727204508.14303: dumping result to json 34886 1727204508.14309: done dumping result, returning 34886 1727204508.14316: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-04b9-2e74-00000000006e] 34886 1727204508.14396: sending task result for task 12b410aa-8751-04b9-2e74-00000000006e 34886 1727204508.14472: done sending task result for task 12b410aa-8751-04b9-2e74-00000000006e 34886 1727204508.14475: WORKER PROCESS EXITING 34886 1727204508.14523: no more pending results, returning what we have 34886 1727204508.14527: in VariableManager get_vars() 34886 1727204508.14570: Calling all_inventory to load vars for managed-node3 34886 1727204508.14574: Calling groups_inventory to load vars for managed-node3 34886 1727204508.14576: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204508.14594: Calling all_plugins_play to load vars for managed-node3 34886 1727204508.14600: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204508.14603: Calling groups_plugins_play to load vars for managed-node3 34886 1727204508.15918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204508.17492: done with get_vars() 34886 1727204508.17515: variable 'ansible_search_path' from source: unknown 34886 1727204508.17516: variable 'ansible_search_path' from source: unknown 34886 1727204508.17550: we have included files to process 34886 1727204508.17552: generating all_blocks data 34886 1727204508.17554: done generating all_blocks data 34886 1727204508.17558: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34886 1727204508.17559: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34886 1727204508.17561: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34886 1727204508.18054: done processing included file 34886 1727204508.18056: iterating over new_blocks loaded from include file 34886 1727204508.18057: in VariableManager get_vars() 34886 1727204508.18079: done with get_vars() 34886 1727204508.18080: filtering new block on tags 34886 1727204508.18096: done filtering new block on tags 34886 1727204508.18098: in VariableManager get_vars() 34886 1727204508.18116: done with get_vars() 34886 1727204508.18117: filtering new block on tags 34886 1727204508.18136: done filtering new block on tags 34886 1727204508.18138: in 
VariableManager get_vars() 34886 1727204508.18157: done with get_vars() 34886 1727204508.18158: filtering new block on tags 34886 1727204508.18176: done filtering new block on tags 34886 1727204508.18178: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 34886 1727204508.18182: extending task lists for all hosts with included blocks 34886 1727204508.18824: done extending task lists 34886 1727204508.18825: done processing included files 34886 1727204508.18826: results queue empty 34886 1727204508.18826: checking for any_errors_fatal 34886 1727204508.18829: done checking for any_errors_fatal 34886 1727204508.18829: checking for max_fail_percentage 34886 1727204508.18830: done checking for max_fail_percentage 34886 1727204508.18831: checking to see if all hosts have failed and the running result is not ok 34886 1727204508.18831: done checking to see if all hosts have failed 34886 1727204508.18832: getting the remaining hosts for this loop 34886 1727204508.18833: done getting the remaining hosts for this loop 34886 1727204508.18835: getting the next task for host managed-node3 34886 1727204508.18838: done getting next task for host managed-node3 34886 1727204508.18841: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34886 1727204508.18843: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204508.18850: getting variables 34886 1727204508.18851: in VariableManager get_vars() 34886 1727204508.18863: Calling all_inventory to load vars for managed-node3 34886 1727204508.18865: Calling groups_inventory to load vars for managed-node3 34886 1727204508.18867: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204508.18871: Calling all_plugins_play to load vars for managed-node3 34886 1727204508.18873: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204508.18875: Calling groups_plugins_play to load vars for managed-node3 34886 1727204508.20035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204508.24966: done with get_vars() 34886 1727204508.24988: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:01:48 -0400 (0:00:00.118) 0:00:26.418 ***** 34886 1727204508.25058: entering _queue_task() for managed-node3/setup 34886 1727204508.25342: worker is 1 (out of 1 available) 34886 1727204508.25357: exiting _queue_task() for managed-node3/setup 34886 1727204508.25370: done queuing things up, now waiting for results queue to drain 34886 1727204508.25372: waiting for pending results... 34886 1727204508.25561: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34886 1727204508.25895: in run() - task 12b410aa-8751-04b9-2e74-000000000513 34886 1727204508.25900: variable 'ansible_search_path' from source: unknown 34886 1727204508.25903: variable 'ansible_search_path' from source: unknown 34886 1727204508.25907: calling self._execute() 34886 1727204508.25956: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204508.25971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204508.25992: variable 'omit' from source: magic vars 34886 1727204508.26522: variable 'ansible_distribution_major_version' from source: facts 34886 1727204508.26545: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204508.26883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204508.29952: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204508.29958: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204508.30003: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204508.30051: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204508.30078: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204508.30145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204508.30172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 34886 1727204508.30202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204508.30239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204508.30251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204508.30305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204508.30327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204508.30347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204508.30382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204508.30398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204508.30530: variable '__network_required_facts' from source: role '' defaults 34886 1727204508.30538: variable 'ansible_facts' from source: unknown 34886 1727204508.31219: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 34886 1727204508.31223: when evaluation is False, skipping this task 34886 1727204508.31228: _execute() done 34886 1727204508.31232: dumping result to json 34886 1727204508.31236: done dumping result, returning 34886 1727204508.31245: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-04b9-2e74-000000000513] 34886 1727204508.31253: sending task result for task 12b410aa-8751-04b9-2e74-000000000513 34886 1727204508.31347: done sending task result for task 12b410aa-8751-04b9-2e74-000000000513 34886 1727204508.31350: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34886 1727204508.31420: no more pending results, returning what we have 34886 1727204508.31425: results queue empty 34886 1727204508.31426: checking for any_errors_fatal 34886 1727204508.31428: done checking for any_errors_fatal 34886 1727204508.31429: checking for max_fail_percentage 34886 1727204508.31430: done checking for max_fail_percentage 34886 1727204508.31431: checking to see if all hosts have failed and the running result is not ok 34886 1727204508.31432: done checking to see if all hosts have failed 34886 1727204508.31433: getting the remaining hosts for this loop 34886 1727204508.31434: done getting the remaining hosts for 
this loop 34886 1727204508.31439: getting the next task for host managed-node3 34886 1727204508.31447: done getting next task for host managed-node3 34886 1727204508.31452: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 34886 1727204508.31456: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204508.31477: getting variables 34886 1727204508.31479: in VariableManager get_vars() 34886 1727204508.31523: Calling all_inventory to load vars for managed-node3 34886 1727204508.31526: Calling groups_inventory to load vars for managed-node3 34886 1727204508.31528: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204508.31539: Calling all_plugins_play to load vars for managed-node3 34886 1727204508.31542: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204508.31546: Calling groups_plugins_play to load vars for managed-node3 34886 1727204508.32773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204508.34362: done with get_vars() 34886 1727204508.34384: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:01:48 -0400 (0:00:00.094) 0:00:26.512 ***** 34886 1727204508.34474: entering _queue_task() for managed-node3/stat 34886 1727204508.34726: worker is 1 (out of 1 available) 34886 1727204508.34741: exiting _queue_task() for managed-node3/stat 34886 1727204508.34754: done queuing things up, now waiting for results queue to drain 34886 1727204508.34756: waiting for pending results... 
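The skipping: result for "Ensure ansible_facts used by role are present" above comes from a setup task gated on the conditional shown in the log, (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0), running with no_log: true (hence the "censored" output). The role source is not included in this log, so the sketch below is only an approximation; the gather_subset value is an assumption, while the task name, module, and when expression are taken from the log.

- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min        # assumption: exact subset not visible in this log
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true

Because every fact named in __network_required_facts was already present in ansible_facts, the difference was empty, the conditional evaluated to False, and the extra fact gathering was skipped.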
34886 1727204508.34953: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 34886 1727204508.35077: in run() - task 12b410aa-8751-04b9-2e74-000000000515 34886 1727204508.35100: variable 'ansible_search_path' from source: unknown 34886 1727204508.35104: variable 'ansible_search_path' from source: unknown 34886 1727204508.35136: calling self._execute() 34886 1727204508.35221: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204508.35229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204508.35240: variable 'omit' from source: magic vars 34886 1727204508.35569: variable 'ansible_distribution_major_version' from source: facts 34886 1727204508.35581: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204508.35727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204508.35947: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204508.35986: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204508.36045: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204508.36081: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204508.36153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204508.36173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204508.36201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204508.36225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204508.36300: variable '__network_is_ostree' from source: set_fact 34886 1727204508.36306: Evaluated conditional (not __network_is_ostree is defined): False 34886 1727204508.36309: when evaluation is False, skipping this task 34886 1727204508.36312: _execute() done 34886 1727204508.36315: dumping result to json 34886 1727204508.36317: done dumping result, returning 34886 1727204508.36330: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-04b9-2e74-000000000515] 34886 1727204508.36334: sending task result for task 12b410aa-8751-04b9-2e74-000000000515 34886 1727204508.36423: done sending task result for task 12b410aa-8751-04b9-2e74-000000000515 34886 1727204508.36427: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34886 1727204508.36483: no more pending results, returning what we have 34886 1727204508.36488: results queue empty 34886 1727204508.36491: checking for any_errors_fatal 34886 1727204508.36497: done checking for any_errors_fatal 34886 1727204508.36498: checking for 
max_fail_percentage 34886 1727204508.36500: done checking for max_fail_percentage 34886 1727204508.36501: checking to see if all hosts have failed and the running result is not ok 34886 1727204508.36502: done checking to see if all hosts have failed 34886 1727204508.36503: getting the remaining hosts for this loop 34886 1727204508.36504: done getting the remaining hosts for this loop 34886 1727204508.36509: getting the next task for host managed-node3 34886 1727204508.36515: done getting next task for host managed-node3 34886 1727204508.36520: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34886 1727204508.36524: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204508.36543: getting variables 34886 1727204508.36544: in VariableManager get_vars() 34886 1727204508.36583: Calling all_inventory to load vars for managed-node3 34886 1727204508.36586: Calling groups_inventory to load vars for managed-node3 34886 1727204508.36597: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204508.36608: Calling all_plugins_play to load vars for managed-node3 34886 1727204508.36611: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204508.36614: Calling groups_plugins_play to load vars for managed-node3 34886 1727204508.37948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204508.39509: done with get_vars() 34886 1727204508.39532: done getting variables 34886 1727204508.39580: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:01:48 -0400 (0:00:00.051) 0:00:26.563 ***** 34886 1727204508.39612: entering _queue_task() for managed-node3/set_fact 34886 1727204508.39856: worker is 1 (out of 1 available) 34886 1727204508.39871: exiting _queue_task() for managed-node3/set_fact 34886 1727204508.39885: done queuing things up, now waiting for results queue to drain 34886 1727204508.39887: waiting for pending results... 
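Both the ostree stat check above and the set_fact task queued next are guarded by (not __network_is_ostree is defined); since __network_is_ostree already exists from an earlier set_fact (the log lists its source as set_fact), both tasks are skipped. A hedged sketch of such a guarded pair follows; the path being stat'ed and the register name are assumptions, and only the task names and the when condition come from the log.

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted          # assumption: typical ostree marker file, not shown in this log
  register: __ostree_booted_stat      # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined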
34886 1727204508.40081: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34886 1727204508.40204: in run() - task 12b410aa-8751-04b9-2e74-000000000516 34886 1727204508.40216: variable 'ansible_search_path' from source: unknown 34886 1727204508.40221: variable 'ansible_search_path' from source: unknown 34886 1727204508.40256: calling self._execute() 34886 1727204508.40345: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204508.40354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204508.40365: variable 'omit' from source: magic vars 34886 1727204508.40691: variable 'ansible_distribution_major_version' from source: facts 34886 1727204508.40701: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204508.40850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204508.41071: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204508.41109: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204508.41166: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204508.41198: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204508.41275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204508.41297: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204508.41320: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204508.41347: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204508.41425: variable '__network_is_ostree' from source: set_fact 34886 1727204508.41438: Evaluated conditional (not __network_is_ostree is defined): False 34886 1727204508.41443: when evaluation is False, skipping this task 34886 1727204508.41446: _execute() done 34886 1727204508.41449: dumping result to json 34886 1727204508.41451: done dumping result, returning 34886 1727204508.41459: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-04b9-2e74-000000000516] 34886 1727204508.41464: sending task result for task 12b410aa-8751-04b9-2e74-000000000516 34886 1727204508.41550: done sending task result for task 12b410aa-8751-04b9-2e74-000000000516 34886 1727204508.41554: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34886 1727204508.41608: no more pending results, returning what we have 34886 1727204508.41613: results queue empty 34886 1727204508.41614: checking for any_errors_fatal 34886 1727204508.41623: done checking for any_errors_fatal 34886 
1727204508.41624: checking for max_fail_percentage 34886 1727204508.41625: done checking for max_fail_percentage 34886 1727204508.41626: checking to see if all hosts have failed and the running result is not ok 34886 1727204508.41628: done checking to see if all hosts have failed 34886 1727204508.41629: getting the remaining hosts for this loop 34886 1727204508.41630: done getting the remaining hosts for this loop 34886 1727204508.41634: getting the next task for host managed-node3 34886 1727204508.41644: done getting next task for host managed-node3 34886 1727204508.41648: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 34886 1727204508.41652: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204508.41669: getting variables 34886 1727204508.41671: in VariableManager get_vars() 34886 1727204508.41716: Calling all_inventory to load vars for managed-node3 34886 1727204508.41720: Calling groups_inventory to load vars for managed-node3 34886 1727204508.41722: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204508.41732: Calling all_plugins_play to load vars for managed-node3 34886 1727204508.41735: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204508.41739: Calling groups_plugins_play to load vars for managed-node3 34886 1727204508.42926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204508.44588: done with get_vars() 34886 1727204508.44611: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:01:48 -0400 (0:00:00.050) 0:00:26.614 ***** 34886 1727204508.44688: entering _queue_task() for managed-node3/service_facts 34886 1727204508.44920: worker is 1 (out of 1 available) 34886 1727204508.44934: exiting _queue_task() for managed-node3/service_facts 34886 1727204508.44948: done queuing things up, now waiting for results queue to drain 34886 1727204508.44950: waiting for pending results... 
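What follows is the full transport trace for the "Check which services are running" task: Ansible creates a remote temp directory, copies the cached AnsiballZ_service_facts.py payload over SFTP, marks it executable, runs it with /usr/bin/python3.12, and receives ansible_facts.services, a dict keyed by systemd unit name. A short sketch of invoking the module and consuming that dict is below; the assert task is purely illustrative and not part of the role.

- name: Check which services are running
  service_facts:

- name: Illustrative follow-up only - verify NetworkManager is active
  assert:
    that:
      - ansible_facts.services['NetworkManager.service'].state == 'running'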
34886 1727204508.45142: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 34886 1727204508.45270: in run() - task 12b410aa-8751-04b9-2e74-000000000518 34886 1727204508.45284: variable 'ansible_search_path' from source: unknown 34886 1727204508.45287: variable 'ansible_search_path' from source: unknown 34886 1727204508.45322: calling self._execute() 34886 1727204508.45403: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204508.45407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204508.45421: variable 'omit' from source: magic vars 34886 1727204508.45732: variable 'ansible_distribution_major_version' from source: facts 34886 1727204508.45747: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204508.45751: variable 'omit' from source: magic vars 34886 1727204508.45812: variable 'omit' from source: magic vars 34886 1727204508.45846: variable 'omit' from source: magic vars 34886 1727204508.45882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204508.45914: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204508.45934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204508.45950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204508.45966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204508.45992: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204508.45996: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204508.46001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204508.46087: Set connection var ansible_timeout to 10 34886 1727204508.46187: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204508.46192: Set connection var ansible_connection to ssh 34886 1727204508.46195: Set connection var ansible_shell_executable to /bin/sh 34886 1727204508.46197: Set connection var ansible_pipelining to False 34886 1727204508.46199: Set connection var ansible_shell_type to sh 34886 1727204508.46201: variable 'ansible_shell_executable' from source: unknown 34886 1727204508.46203: variable 'ansible_connection' from source: unknown 34886 1727204508.46206: variable 'ansible_module_compression' from source: unknown 34886 1727204508.46208: variable 'ansible_shell_type' from source: unknown 34886 1727204508.46211: variable 'ansible_shell_executable' from source: unknown 34886 1727204508.46213: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204508.46215: variable 'ansible_pipelining' from source: unknown 34886 1727204508.46217: variable 'ansible_timeout' from source: unknown 34886 1727204508.46219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204508.46319: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34886 1727204508.46331: variable 'omit' from source: magic vars 34886 
1727204508.46336: starting attempt loop 34886 1727204508.46339: running the handler 34886 1727204508.46355: _low_level_execute_command(): starting 34886 1727204508.46362: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204508.46913: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204508.46917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204508.46923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204508.46925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204508.46983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204508.46992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204508.46995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204508.47030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204508.48828: stdout chunk (state=3): >>>/root <<< 34886 1727204508.48937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204508.48991: stderr chunk (state=3): >>><<< 34886 1727204508.49025: stdout chunk (state=3): >>><<< 34886 1727204508.49030: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204508.49033: _low_level_execute_command(): starting 34886 1727204508.49037: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522 `" && echo 
ansible-tmp-1727204508.4901357-36221-201437055086522="` echo /root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522 `" ) && sleep 0' 34886 1727204508.49494: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204508.49497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204508.49500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204508.49510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204508.49558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204508.49566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204508.49603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204508.51605: stdout chunk (state=3): >>>ansible-tmp-1727204508.4901357-36221-201437055086522=/root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522 <<< 34886 1727204508.51723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204508.51771: stderr chunk (state=3): >>><<< 34886 1727204508.51775: stdout chunk (state=3): >>><<< 34886 1727204508.51792: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204508.4901357-36221-201437055086522=/root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204508.51834: variable 'ansible_module_compression' from source: unknown 34886 1727204508.51873: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 34886 1727204508.51913: variable 'ansible_facts' from source: unknown 34886 1727204508.51974: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522/AnsiballZ_service_facts.py 34886 1727204508.52092: Sending initial data 34886 1727204508.52095: Sent initial data (162 bytes) 34886 1727204508.52554: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204508.52558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204508.52560: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204508.52563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204508.52624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204508.52627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204508.52660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204508.54301: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204508.54335: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204508.54369: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpxtld47r2 /root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522/AnsiballZ_service_facts.py <<< 34886 1727204508.54372: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522/AnsiballZ_service_facts.py" <<< 34886 1727204508.54407: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpxtld47r2" to remote "/root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522/AnsiballZ_service_facts.py" <<< 34886 1727204508.54410: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522/AnsiballZ_service_facts.py" <<< 34886 1727204508.55197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204508.55255: stderr chunk (state=3): >>><<< 34886 1727204508.55259: stdout chunk (state=3): >>><<< 34886 1727204508.55279: done transferring module to remote 34886 1727204508.55291: _low_level_execute_command(): starting 34886 1727204508.55296: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522/ /root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522/AnsiballZ_service_facts.py && sleep 0' 34886 1727204508.55738: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204508.55742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204508.55746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204508.55748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204508.55804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204508.55809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204508.55847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204508.57695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204508.57740: stderr chunk (state=3): >>><<< 34886 1727204508.57743: stdout chunk (state=3): >>><<< 34886 1727204508.57756: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204508.57760: _low_level_execute_command(): starting 34886 1727204508.57765: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522/AnsiballZ_service_facts.py && sleep 0' 34886 1727204508.58195: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204508.58198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204508.58201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204508.58203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204508.58263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204508.58266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204508.58306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204510.48881: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": 
"dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service",<<< 34886 1727204510.48927: stdout chunk (state=3): >>> "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 34886 1727204510.50554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204510.50608: stderr chunk (state=3): >>><<< 34886 1727204510.50615: stdout chunk (state=3): >>><<< 34886 1727204510.50639: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": 
"stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": 
"systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": 
"plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 34886 1727204510.51304: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204510.51315: _low_level_execute_command(): starting 34886 1727204510.51362: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204508.4901357-36221-201437055086522/ > /dev/null 2>&1 && sleep 0' 34886 1727204510.52000: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204510.52003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204510.52006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204510.52011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204510.52014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204510.52016: stderr chunk (state=3): >>>debug2: match not found <<< 34886 1727204510.52021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204510.52024: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34886 1727204510.52026: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204510.52029: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34886 1727204510.52034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204510.52045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204510.52058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204510.52066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204510.52074: stderr chunk (state=3): >>>debug2: match found <<< 34886 1727204510.52084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 
1727204510.52165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204510.52195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204510.52199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204510.52261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204510.54182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204510.54235: stderr chunk (state=3): >>><<< 34886 1727204510.54239: stdout chunk (state=3): >>><<< 34886 1727204510.54251: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204510.54258: handler run complete 34886 1727204510.54423: variable 'ansible_facts' from source: unknown 34886 1727204510.54561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204510.54999: variable 'ansible_facts' from source: unknown 34886 1727204510.55194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204510.55317: attempt loop complete, returning result 34886 1727204510.55324: _execute() done 34886 1727204510.55327: dumping result to json 34886 1727204510.55378: done dumping result, returning 34886 1727204510.55397: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-04b9-2e74-000000000518] 34886 1727204510.55444: sending task result for task 12b410aa-8751-04b9-2e74-000000000518 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34886 1727204510.56723: no more pending results, returning what we have 34886 1727204510.56726: results queue empty 34886 1727204510.56727: checking for any_errors_fatal 34886 1727204510.56731: done checking for any_errors_fatal 34886 1727204510.56731: checking for max_fail_percentage 34886 1727204510.56733: done checking for max_fail_percentage 34886 1727204510.56733: checking to see if all hosts have failed and the running result is not ok 34886 1727204510.56734: done checking to see if all hosts have failed 34886 1727204510.56734: getting the remaining hosts for this loop 34886 1727204510.56735: done getting the remaining hosts for this loop 
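The censored "ok" result above comes from the service_facts module run by the task "fedora.linux_system_roles.network : Check which services are running"; its return value is the ansible_facts.services map whose per-service entries (name, state, status, source) are visible in the raw stdout earlier in this log. A minimal sketch of how such a facts-gathering task is typically written and then consumed follows; the second task, its name, and the specific service it checks are illustrative assumptions, not part of this run:

- name: Check which services are running
  ansible.builtin.service_facts:

- name: Require NetworkManager to be active   # illustrative follow-up, not in this playbook
  ansible.builtin.assert:
    that:
      - "'NetworkManager.service' in ansible_facts.services"
      - "ansible_facts.services['NetworkManager.service'].state == 'running'"

Because the real task in this run sets no_log: true, the gathered map is only visible in the low-level stdout above, not in the rendered task result.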
34886 1727204510.56739: getting the next task for host managed-node3 34886 1727204510.56743: done getting next task for host managed-node3 34886 1727204510.56745: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 34886 1727204510.56748: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204510.56758: done sending task result for task 12b410aa-8751-04b9-2e74-000000000518 34886 1727204510.56761: WORKER PROCESS EXITING 34886 1727204510.56768: getting variables 34886 1727204510.56769: in VariableManager get_vars() 34886 1727204510.56802: Calling all_inventory to load vars for managed-node3 34886 1727204510.56805: Calling groups_inventory to load vars for managed-node3 34886 1727204510.56807: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204510.56815: Calling all_plugins_play to load vars for managed-node3 34886 1727204510.56818: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204510.56826: Calling groups_plugins_play to load vars for managed-node3 34886 1727204510.58088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204510.59684: done with get_vars() 34886 1727204510.59708: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:01:50 -0400 (0:00:02.151) 0:00:28.765 ***** 34886 1727204510.59796: entering _queue_task() for managed-node3/package_facts 34886 1727204510.60046: worker is 1 (out of 1 available) 34886 1727204510.60061: exiting _queue_task() for managed-node3/package_facts 34886 1727204510.60074: done queuing things up, now waiting for results queue to drain 34886 1727204510.60076: waiting for pending results... 
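The task queued next, "fedora.linux_system_roles.network : Check which packages are installed" (set_facts.yml:26), runs the package_facts module on the managed node. A minimal sketch of that pattern is shown below; the manager argument and the debug follow-up are illustrative assumptions added here, not values copied from the role:

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto   # assumption: the role may pass different or no arguments

- name: Report whether NetworkManager is installed   # illustrative follow-up, not in this playbook
  ansible.builtin.debug:
    msg: "NetworkManager installed: {{ 'NetworkManager' in ansible_facts.packages }}"

ansible_facts.packages is keyed by package name, so membership tests like the one above are the usual way roles branch on installed software.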
34886 1727204510.60262: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 34886 1727204510.60384: in run() - task 12b410aa-8751-04b9-2e74-000000000519 34886 1727204510.60398: variable 'ansible_search_path' from source: unknown 34886 1727204510.60402: variable 'ansible_search_path' from source: unknown 34886 1727204510.60438: calling self._execute() 34886 1727204510.60527: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204510.60533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204510.60545: variable 'omit' from source: magic vars 34886 1727204510.60859: variable 'ansible_distribution_major_version' from source: facts 34886 1727204510.60872: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204510.60879: variable 'omit' from source: magic vars 34886 1727204510.60958: variable 'omit' from source: magic vars 34886 1727204510.60992: variable 'omit' from source: magic vars 34886 1727204510.61029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204510.61061: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204510.61081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204510.61101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204510.61112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204510.61141: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204510.61145: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204510.61148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204510.61240: Set connection var ansible_timeout to 10 34886 1727204510.61245: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204510.61249: Set connection var ansible_connection to ssh 34886 1727204510.61256: Set connection var ansible_shell_executable to /bin/sh 34886 1727204510.61264: Set connection var ansible_pipelining to False 34886 1727204510.61267: Set connection var ansible_shell_type to sh 34886 1727204510.61291: variable 'ansible_shell_executable' from source: unknown 34886 1727204510.61294: variable 'ansible_connection' from source: unknown 34886 1727204510.61300: variable 'ansible_module_compression' from source: unknown 34886 1727204510.61303: variable 'ansible_shell_type' from source: unknown 34886 1727204510.61306: variable 'ansible_shell_executable' from source: unknown 34886 1727204510.61308: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204510.61311: variable 'ansible_pipelining' from source: unknown 34886 1727204510.61322: variable 'ansible_timeout' from source: unknown 34886 1727204510.61325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204510.61487: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34886 1727204510.61502: variable 'omit' from source: magic vars 34886 
1727204510.61508: starting attempt loop 34886 1727204510.61511: running the handler 34886 1727204510.61527: _low_level_execute_command(): starting 34886 1727204510.61540: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204510.62073: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204510.62080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204510.62084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204510.62143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204510.62147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204510.62153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204510.62197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204510.63913: stdout chunk (state=3): >>>/root <<< 34886 1727204510.64040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204510.64082: stderr chunk (state=3): >>><<< 34886 1727204510.64085: stdout chunk (state=3): >>><<< 34886 1727204510.64104: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204510.64115: _low_level_execute_command(): starting 34886 1727204510.64125: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579 `" && echo ansible-tmp-1727204510.6410365-36251-107283683537579="` 
echo /root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579 `" ) && sleep 0' 34886 1727204510.64581: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204510.64585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204510.64588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204510.64599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204510.64648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204510.64655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204510.64694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204510.66672: stdout chunk (state=3): >>>ansible-tmp-1727204510.6410365-36251-107283683537579=/root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579 <<< 34886 1727204510.66784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204510.66834: stderr chunk (state=3): >>><<< 34886 1727204510.66838: stdout chunk (state=3): >>><<< 34886 1727204510.66854: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204510.6410365-36251-107283683537579=/root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204510.66898: variable 'ansible_module_compression' from source: unknown 34886 1727204510.66940: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 34886 
1727204510.67095: variable 'ansible_facts' from source: unknown 34886 1727204510.67138: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579/AnsiballZ_package_facts.py 34886 1727204510.67260: Sending initial data 34886 1727204510.67264: Sent initial data (162 bytes) 34886 1727204510.67727: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204510.67730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204510.67733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204510.67735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204510.67790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204510.67795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204510.67833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204510.69424: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204510.69433: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204510.69460: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204510.69497: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp5zt78bz2 /root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579/AnsiballZ_package_facts.py <<< 34886 1727204510.69510: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579/AnsiballZ_package_facts.py" <<< 34886 1727204510.69534: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp5zt78bz2" to remote "/root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579/AnsiballZ_package_facts.py" <<< 34886 1727204510.71186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204510.71254: stderr chunk (state=3): >>><<< 34886 1727204510.71258: stdout chunk (state=3): >>><<< 34886 1727204510.71277: done transferring module to remote 34886 1727204510.71287: _low_level_execute_command(): starting 34886 1727204510.71295: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579/ /root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579/AnsiballZ_package_facts.py && sleep 0' 34886 1727204510.71762: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204510.71766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204510.71768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204510.71771: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204510.71774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204510.71825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204510.71835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204510.71870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204510.73679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204510.73729: stderr chunk (state=3): >>><<< 34886 1727204510.73733: stdout chunk (state=3): >>><<< 34886 1727204510.73746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204510.73749: _low_level_execute_command(): starting 34886 1727204510.73757: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579/AnsiballZ_package_facts.py && sleep 0' 34886 1727204510.74183: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204510.74187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204510.74191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204510.74194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204510.74248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204510.74254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204510.74296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204511.42573: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": 
[{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 34886 1727204511.42581: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 34886 1727204511.42603: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 34886 1727204511.42658: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": 
[{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 34886 1727204511.42672: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 34886 1727204511.42676: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 34886 1727204511.42691: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": 
"1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 34886 1727204511.42697: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": n<<< 34886 1727204511.42737: stdout chunk (state=3): >>>ull, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": 
"passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 34886 1727204511.42743: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": 
"perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": 
"rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 34886 1727204511.42778: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": 
"5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 34886 1727204511.42792: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 34886 1727204511.42815: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": 
"2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 34886 1727204511.42824: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 34886 1727204511.44714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204511.44777: stderr chunk (state=3): >>><<< 34886 1727204511.44781: stdout chunk (state=3): >>><<< 34886 1727204511.44824: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 34886 1727204511.47076: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204511.47094: _low_level_execute_command(): starting 34886 1727204511.47100: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204510.6410365-36251-107283683537579/ > /dev/null 2>&1 && sleep 0' 34886 1727204511.47592: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204511.47596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204511.47599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204511.47601: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204511.47603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204511.47655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204511.47658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204511.47706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204511.49635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204511.49699: stderr chunk (state=3): >>><<< 34886 1727204511.49703: stdout chunk (state=3): >>><<< 34886 1727204511.49721: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204511.49728: handler run complete 34886 1727204511.50550: variable 'ansible_facts' from source: unknown 34886 1727204511.51014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204511.53037: variable 'ansible_facts' from source: unknown 34886 1727204511.53465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204511.54239: attempt loop complete, returning result 34886 1727204511.54255: _execute() done 34886 1727204511.54258: dumping result to json 34886 1727204511.54438: done dumping result, returning 34886 1727204511.54447: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-04b9-2e74-000000000519] 34886 1727204511.54454: sending task result for task 12b410aa-8751-04b9-2e74-000000000519 34886 1727204511.56469: done sending task result for task 12b410aa-8751-04b9-2e74-000000000519 34886 1727204511.56472: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34886 1727204511.56584: no more pending results, returning what we have 34886 1727204511.56586: results queue empty 34886 1727204511.56587: checking for any_errors_fatal 34886 1727204511.56593: done checking for any_errors_fatal 34886 1727204511.56594: checking for max_fail_percentage 34886 1727204511.56595: done checking for max_fail_percentage 34886 1727204511.56596: checking to see if all hosts have failed and the running result is not ok 34886 1727204511.56597: done checking to see if all hosts have failed 34886 1727204511.56597: getting the remaining hosts for this loop 34886 1727204511.56598: done getting the remaining hosts for this loop 34886 1727204511.56601: getting the next task for host managed-node3 34886 1727204511.56606: done getting next task for host managed-node3 34886 1727204511.56609: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34886 1727204511.56611: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204511.56620: getting variables 34886 1727204511.56622: in VariableManager get_vars() 34886 1727204511.56652: Calling all_inventory to load vars for managed-node3 34886 1727204511.56655: Calling groups_inventory to load vars for managed-node3 34886 1727204511.56657: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204511.56665: Calling all_plugins_play to load vars for managed-node3 34886 1727204511.56667: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204511.56669: Calling groups_plugins_play to load vars for managed-node3 34886 1727204511.57856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204511.59451: done with get_vars() 34886 1727204511.59478: done getting variables 34886 1727204511.59532: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.997) 0:00:29.763 ***** 34886 1727204511.59562: entering _queue_task() for managed-node3/debug 34886 1727204511.59838: worker is 1 (out of 1 available) 34886 1727204511.59853: exiting _queue_task() for managed-node3/debug 34886 1727204511.59867: done queuing things up, now waiting for results queue to drain 34886 1727204511.59869: waiting for pending results... 34886 1727204511.60060: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 34886 1727204511.60171: in run() - task 12b410aa-8751-04b9-2e74-00000000006f 34886 1727204511.60185: variable 'ansible_search_path' from source: unknown 34886 1727204511.60188: variable 'ansible_search_path' from source: unknown 34886 1727204511.60297: calling self._execute() 34886 1727204511.60317: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204511.60326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204511.60333: variable 'omit' from source: magic vars 34886 1727204511.60665: variable 'ansible_distribution_major_version' from source: facts 34886 1727204511.60676: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204511.60682: variable 'omit' from source: magic vars 34886 1727204511.60735: variable 'omit' from source: magic vars 34886 1727204511.60825: variable 'network_provider' from source: set_fact 34886 1727204511.60840: variable 'omit' from source: magic vars 34886 1727204511.60880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204511.60913: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204511.60932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204511.60949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204511.60960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 
1727204511.60992: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204511.60996: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204511.61001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204511.61087: Set connection var ansible_timeout to 10 34886 1727204511.61095: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204511.61100: Set connection var ansible_connection to ssh 34886 1727204511.61105: Set connection var ansible_shell_executable to /bin/sh 34886 1727204511.61114: Set connection var ansible_pipelining to False 34886 1727204511.61117: Set connection var ansible_shell_type to sh 34886 1727204511.61140: variable 'ansible_shell_executable' from source: unknown 34886 1727204511.61144: variable 'ansible_connection' from source: unknown 34886 1727204511.61147: variable 'ansible_module_compression' from source: unknown 34886 1727204511.61150: variable 'ansible_shell_type' from source: unknown 34886 1727204511.61155: variable 'ansible_shell_executable' from source: unknown 34886 1727204511.61159: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204511.61164: variable 'ansible_pipelining' from source: unknown 34886 1727204511.61167: variable 'ansible_timeout' from source: unknown 34886 1727204511.61173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204511.61294: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204511.61308: variable 'omit' from source: magic vars 34886 1727204511.61312: starting attempt loop 34886 1727204511.61315: running the handler 34886 1727204511.61357: handler run complete 34886 1727204511.61371: attempt loop complete, returning result 34886 1727204511.61374: _execute() done 34886 1727204511.61377: dumping result to json 34886 1727204511.61380: done dumping result, returning 34886 1727204511.61388: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-04b9-2e74-00000000006f] 34886 1727204511.61397: sending task result for task 12b410aa-8751-04b9-2e74-00000000006f 34886 1727204511.61487: done sending task result for task 12b410aa-8751-04b9-2e74-00000000006f 34886 1727204511.61493: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 34886 1727204511.61587: no more pending results, returning what we have 34886 1727204511.61596: results queue empty 34886 1727204511.61598: checking for any_errors_fatal 34886 1727204511.61605: done checking for any_errors_fatal 34886 1727204511.61606: checking for max_fail_percentage 34886 1727204511.61608: done checking for max_fail_percentage 34886 1727204511.61609: checking to see if all hosts have failed and the running result is not ok 34886 1727204511.61610: done checking to see if all hosts have failed 34886 1727204511.61611: getting the remaining hosts for this loop 34886 1727204511.61613: done getting the remaining hosts for this loop 34886 1727204511.61617: getting the next task for host managed-node3 34886 1727204511.61627: done getting next task for host managed-node3 34886 1727204511.61630: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 34886 1727204511.61634: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204511.61646: getting variables 34886 1727204511.61647: in VariableManager get_vars() 34886 1727204511.61685: Calling all_inventory to load vars for managed-node3 34886 1727204511.61688: Calling groups_inventory to load vars for managed-node3 34886 1727204511.61696: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204511.61709: Calling all_plugins_play to load vars for managed-node3 34886 1727204511.61713: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204511.61717: Calling groups_plugins_play to load vars for managed-node3 34886 1727204511.62930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204511.64604: done with get_vars() 34886 1727204511.64631: done getting variables 34886 1727204511.64677: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.051) 0:00:29.814 ***** 34886 1727204511.64708: entering _queue_task() for managed-node3/fail 34886 1727204511.64957: worker is 1 (out of 1 available) 34886 1727204511.64971: exiting _queue_task() for managed-node3/fail 34886 1727204511.64984: done queuing things up, now waiting for results queue to drain 34886 1727204511.64986: waiting for pending results... 
34886 1727204511.65186: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34886 1727204511.65295: in run() - task 12b410aa-8751-04b9-2e74-000000000070 34886 1727204511.65309: variable 'ansible_search_path' from source: unknown 34886 1727204511.65313: variable 'ansible_search_path' from source: unknown 34886 1727204511.65350: calling self._execute() 34886 1727204511.65440: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204511.65444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204511.65454: variable 'omit' from source: magic vars 34886 1727204511.65779: variable 'ansible_distribution_major_version' from source: facts 34886 1727204511.65791: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204511.65897: variable 'network_state' from source: role '' defaults 34886 1727204511.65906: Evaluated conditional (network_state != {}): False 34886 1727204511.65910: when evaluation is False, skipping this task 34886 1727204511.65913: _execute() done 34886 1727204511.65916: dumping result to json 34886 1727204511.65921: done dumping result, returning 34886 1727204511.65932: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-04b9-2e74-000000000070] 34886 1727204511.65938: sending task result for task 12b410aa-8751-04b9-2e74-000000000070 34886 1727204511.66033: done sending task result for task 12b410aa-8751-04b9-2e74-000000000070 34886 1727204511.66036: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34886 1727204511.66086: no more pending results, returning what we have 34886 1727204511.66092: results queue empty 34886 1727204511.66093: checking for any_errors_fatal 34886 1727204511.66101: done checking for any_errors_fatal 34886 1727204511.66102: checking for max_fail_percentage 34886 1727204511.66104: done checking for max_fail_percentage 34886 1727204511.66105: checking to see if all hosts have failed and the running result is not ok 34886 1727204511.66106: done checking to see if all hosts have failed 34886 1727204511.66107: getting the remaining hosts for this loop 34886 1727204511.66109: done getting the remaining hosts for this loop 34886 1727204511.66113: getting the next task for host managed-node3 34886 1727204511.66120: done getting next task for host managed-node3 34886 1727204511.66124: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34886 1727204511.66128: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204511.66145: getting variables 34886 1727204511.66147: in VariableManager get_vars() 34886 1727204511.66184: Calling all_inventory to load vars for managed-node3 34886 1727204511.66187: Calling groups_inventory to load vars for managed-node3 34886 1727204511.66197: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204511.66208: Calling all_plugins_play to load vars for managed-node3 34886 1727204511.66211: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204511.66215: Calling groups_plugins_play to load vars for managed-node3 34886 1727204511.67384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204511.68953: done with get_vars() 34886 1727204511.68974: done getting variables 34886 1727204511.69023: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.043) 0:00:29.858 ***** 34886 1727204511.69050: entering _queue_task() for managed-node3/fail 34886 1727204511.69264: worker is 1 (out of 1 available) 34886 1727204511.69279: exiting _queue_task() for managed-node3/fail 34886 1727204511.69294: done queuing things up, now waiting for results queue to drain 34886 1727204511.69297: waiting for pending results... 
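
The skip above is the first of the role's guard tasks: a fail action at roles/network/tasks/main.yml:11 that only fires when network_state is non-empty and, per the task name, the initscripts provider is selected. Only network_state != {} is visible in the log because when-lists short-circuit on the first False condition; the second condition and the message wording below are assumptions.

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration is not supported with the initscripts provider   # assumed wording
      when:
        - network_state != {}                 # evaluated False here, so the task is skipped
        - network_provider == "initscripts"   # assumption based on the task name; not exercised in this log
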
34886 1727204511.69485: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34886 1727204511.69584: in run() - task 12b410aa-8751-04b9-2e74-000000000071 34886 1727204511.69598: variable 'ansible_search_path' from source: unknown 34886 1727204511.69601: variable 'ansible_search_path' from source: unknown 34886 1727204511.69638: calling self._execute() 34886 1727204511.69716: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204511.69726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204511.69740: variable 'omit' from source: magic vars 34886 1727204511.70051: variable 'ansible_distribution_major_version' from source: facts 34886 1727204511.70066: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204511.70167: variable 'network_state' from source: role '' defaults 34886 1727204511.70180: Evaluated conditional (network_state != {}): False 34886 1727204511.70183: when evaluation is False, skipping this task 34886 1727204511.70186: _execute() done 34886 1727204511.70192: dumping result to json 34886 1727204511.70198: done dumping result, returning 34886 1727204511.70202: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-04b9-2e74-000000000071] 34886 1727204511.70208: sending task result for task 12b410aa-8751-04b9-2e74-000000000071 34886 1727204511.70305: done sending task result for task 12b410aa-8751-04b9-2e74-000000000071 34886 1727204511.70309: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34886 1727204511.70362: no more pending results, returning what we have 34886 1727204511.70366: results queue empty 34886 1727204511.70367: checking for any_errors_fatal 34886 1727204511.70374: done checking for any_errors_fatal 34886 1727204511.70375: checking for max_fail_percentage 34886 1727204511.70376: done checking for max_fail_percentage 34886 1727204511.70377: checking to see if all hosts have failed and the running result is not ok 34886 1727204511.70378: done checking to see if all hosts have failed 34886 1727204511.70379: getting the remaining hosts for this loop 34886 1727204511.70380: done getting the remaining hosts for this loop 34886 1727204511.70384: getting the next task for host managed-node3 34886 1727204511.70391: done getting next task for host managed-node3 34886 1727204511.70395: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34886 1727204511.70399: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204511.70417: getting variables 34886 1727204511.70420: in VariableManager get_vars() 34886 1727204511.70458: Calling all_inventory to load vars for managed-node3 34886 1727204511.70461: Calling groups_inventory to load vars for managed-node3 34886 1727204511.70464: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204511.70473: Calling all_plugins_play to load vars for managed-node3 34886 1727204511.70475: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204511.70477: Calling groups_plugins_play to load vars for managed-node3 34886 1727204511.71757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204511.73301: done with get_vars() 34886 1727204511.73322: done getting variables 34886 1727204511.73368: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.043) 0:00:29.901 ***** 34886 1727204511.73395: entering _queue_task() for managed-node3/fail 34886 1727204511.73607: worker is 1 (out of 1 available) 34886 1727204511.73621: exiting _queue_task() for managed-node3/fail 34886 1727204511.73633: done queuing things up, now waiting for results queue to drain 34886 1727204511.73635: waiting for pending results... 
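
Both abort guards (main.yml:11 and main.yml:18) skip for the same reason: network_state is read "from source: role '' defaults" and evaluates equal to {}, so network_state != {} is False before any further condition is checked. A sketch of the implied default (the role's real defaults file contains many other entries not shown in this log):

    # roles/network/defaults/main.yml (sketch inferred from the log)
    network_state: {}

Supplying a non-empty network_state in a play would make these guards go on to evaluate their remaining conditions instead of skipping immediately.
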
34886 1727204511.73819: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34886 1727204511.73933: in run() - task 12b410aa-8751-04b9-2e74-000000000072 34886 1727204511.73945: variable 'ansible_search_path' from source: unknown 34886 1727204511.73948: variable 'ansible_search_path' from source: unknown 34886 1727204511.73982: calling self._execute() 34886 1727204511.74068: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204511.74077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204511.74091: variable 'omit' from source: magic vars 34886 1727204511.74393: variable 'ansible_distribution_major_version' from source: facts 34886 1727204511.74404: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204511.74560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204511.76313: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204511.76369: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204511.76403: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204511.76437: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204511.76460: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204511.76534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204511.76568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204511.76592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204511.76632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204511.76645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204511.76729: variable 'ansible_distribution_major_version' from source: facts 34886 1727204511.76743: Evaluated conditional (ansible_distribution_major_version | int > 9): True 34886 1727204511.76841: variable 'ansible_distribution' from source: facts 34886 1727204511.76845: variable '__network_rh_distros' from source: role '' defaults 34886 1727204511.76855: Evaluated conditional (ansible_distribution in __network_rh_distros): False 34886 1727204511.76858: when evaluation is False, skipping this task 34886 1727204511.76861: _execute() done 34886 1727204511.76866: dumping result to json 34886 1727204511.76869: done dumping result, returning 34886 1727204511.76878: done running TaskExecutor() 
for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-04b9-2e74-000000000072] 34886 1727204511.76885: sending task result for task 12b410aa-8751-04b9-2e74-000000000072 34886 1727204511.76974: done sending task result for task 12b410aa-8751-04b9-2e74-000000000072 34886 1727204511.76977: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 34886 1727204511.77028: no more pending results, returning what we have 34886 1727204511.77032: results queue empty 34886 1727204511.77034: checking for any_errors_fatal 34886 1727204511.77041: done checking for any_errors_fatal 34886 1727204511.77042: checking for max_fail_percentage 34886 1727204511.77044: done checking for max_fail_percentage 34886 1727204511.77045: checking to see if all hosts have failed and the running result is not ok 34886 1727204511.77046: done checking to see if all hosts have failed 34886 1727204511.77047: getting the remaining hosts for this loop 34886 1727204511.77048: done getting the remaining hosts for this loop 34886 1727204511.77053: getting the next task for host managed-node3 34886 1727204511.77059: done getting next task for host managed-node3 34886 1727204511.77063: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34886 1727204511.77067: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204511.77086: getting variables 34886 1727204511.77088: in VariableManager get_vars() 34886 1727204511.77138: Calling all_inventory to load vars for managed-node3 34886 1727204511.77141: Calling groups_inventory to load vars for managed-node3 34886 1727204511.77144: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204511.77154: Calling all_plugins_play to load vars for managed-node3 34886 1727204511.77158: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204511.77161: Calling groups_plugins_play to load vars for managed-node3 34886 1727204511.78366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204511.80049: done with get_vars() 34886 1727204511.80072: done getting variables 34886 1727204511.80124: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.067) 0:00:29.969 ***** 34886 1727204511.80153: entering _queue_task() for managed-node3/dnf 34886 1727204511.80421: worker is 1 (out of 1 available) 34886 1727204511.80436: exiting _queue_task() for managed-node3/dnf 34886 1727204511.80451: done queuing things up, now waiting for results queue to drain 34886 1727204511.80453: waiting for pending results... 
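
The teaming guard at roles/network/tasks/main.yml:25 is the first task whose when-list gets past the version check: on this Fedora 39 node ansible_distribution_major_version | int > 9 is True, but ansible_distribution in __network_rh_distros is False, so it still skips. A sketch of the guard as the log implies it; the fail message is assumed, and __network_rh_distros is a role-defaults list whose contents are not shown here:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later hosts   # assumed wording
      when:
        - ansible_distribution_major_version | int > 9      # True for Fedora 39
        - ansible_distribution in __network_rh_distros      # False here, so the task is skipped
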
34886 1727204511.80653: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34886 1727204511.80764: in run() - task 12b410aa-8751-04b9-2e74-000000000073 34886 1727204511.80776: variable 'ansible_search_path' from source: unknown 34886 1727204511.80780: variable 'ansible_search_path' from source: unknown 34886 1727204511.80899: calling self._execute() 34886 1727204511.80911: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204511.80924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204511.80931: variable 'omit' from source: magic vars 34886 1727204511.81256: variable 'ansible_distribution_major_version' from source: facts 34886 1727204511.81267: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204511.81447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204511.83217: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204511.83268: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204511.83304: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204511.83335: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204511.83358: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204511.83440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204511.83463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204511.83484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204511.83524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204511.83537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204511.83633: variable 'ansible_distribution' from source: facts 34886 1727204511.83639: variable 'ansible_distribution_major_version' from source: facts 34886 1727204511.83647: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 34886 1727204511.83740: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204511.83855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204511.83878: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204511.83900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204511.83933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204511.83947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204511.83987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204511.84008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204511.84028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204511.84061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204511.84074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204511.84113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204511.84133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204511.84153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204511.84189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204511.84203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204511.84334: variable 'network_connections' from source: task vars 34886 1727204511.84345: variable 'interface' from source: play vars 34886 1727204511.84401: variable 'interface' from source: play vars 34886 1727204511.84461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204511.84597: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204511.84632: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204511.84659: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204511.84683: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204511.84750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204511.84755: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204511.84766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204511.84788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204511.84835: variable '__network_team_connections_defined' from source: role '' defaults 34886 1727204511.85062: variable 'network_connections' from source: task vars 34886 1727204511.85067: variable 'interface' from source: play vars 34886 1727204511.85122: variable 'interface' from source: play vars 34886 1727204511.85147: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34886 1727204511.85152: when evaluation is False, skipping this task 34886 1727204511.85155: _execute() done 34886 1727204511.85158: dumping result to json 34886 1727204511.85165: done dumping result, returning 34886 1727204511.85173: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-04b9-2e74-000000000073] 34886 1727204511.85180: sending task result for task 12b410aa-8751-04b9-2e74-000000000073 34886 1727204511.85277: done sending task result for task 12b410aa-8751-04b9-2e74-000000000073 34886 1727204511.85281: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34886 1727204511.85336: no more pending results, returning what we have 34886 1727204511.85341: results queue empty 34886 1727204511.85342: checking for any_errors_fatal 34886 1727204511.85351: done checking for any_errors_fatal 34886 1727204511.85352: checking for max_fail_percentage 34886 1727204511.85354: done checking for max_fail_percentage 34886 1727204511.85355: checking to see if all hosts have failed and the running result is not ok 34886 1727204511.85356: done checking to see if all hosts have failed 34886 1727204511.85357: getting the remaining hosts for this loop 34886 1727204511.85358: done getting the remaining hosts for this loop 34886 1727204511.85363: getting the next task for host managed-node3 34886 1727204511.85369: done getting next task for host managed-node3 34886 1727204511.85374: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the YUM package manager due to wireless or team interfaces 34886 1727204511.85377: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204511.85400: getting variables 34886 1727204511.85402: in VariableManager get_vars() 34886 1727204511.85446: Calling all_inventory to load vars for managed-node3 34886 1727204511.85449: Calling groups_inventory to load vars for managed-node3 34886 1727204511.85452: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204511.85462: Calling all_plugins_play to load vars for managed-node3 34886 1727204511.85465: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204511.85468: Calling groups_plugins_play to load vars for managed-node3 34886 1727204511.86709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204511.88269: done with get_vars() 34886 1727204511.88294: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34886 1727204511.88359: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.082) 0:00:30.051 ***** 34886 1727204511.88385: entering _queue_task() for managed-node3/yum 34886 1727204511.88642: worker is 1 (out of 1 available) 34886 1727204511.88658: exiting _queue_task() for managed-node3/yum 34886 1727204511.88672: done queuing things up, now waiting for results queue to drain 34886 1727204511.88674: waiting for pending results... 
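
The DNF check at roles/network/tasks/main.yml:36 follows the same guarded pattern: the log exercises its when-list (distribution check True, wireless/team check False against the play's network_connections/interface vars), while the module arguments themselves are never shown. A sketch with the two evaluated conditions; the dnf arguments below are assumptions:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # assumption; the real argument list is not visible in this log
        state: latest                    # assumption
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # True here
        - __network_wireless_connections_defined or __network_team_connections_defined       # False: no wireless or team connections defined, so the task is skipped
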
34886 1727204511.88874: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34886 1727204511.88980: in run() - task 12b410aa-8751-04b9-2e74-000000000074 34886 1727204511.88995: variable 'ansible_search_path' from source: unknown 34886 1727204511.88999: variable 'ansible_search_path' from source: unknown 34886 1727204511.89037: calling self._execute() 34886 1727204511.89121: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204511.89134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204511.89142: variable 'omit' from source: magic vars 34886 1727204511.89464: variable 'ansible_distribution_major_version' from source: facts 34886 1727204511.89473: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204511.89631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204511.91680: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204511.91735: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204511.91768: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204511.91799: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204511.91823: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204511.91897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204511.91920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204511.91943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204511.91980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204511.91994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204511.92076: variable 'ansible_distribution_major_version' from source: facts 34886 1727204511.92092: Evaluated conditional (ansible_distribution_major_version | int < 8): False 34886 1727204511.92096: when evaluation is False, skipping this task 34886 1727204511.92099: _execute() done 34886 1727204511.92105: dumping result to json 34886 1727204511.92108: done dumping result, returning 34886 1727204511.92117: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-04b9-2e74-000000000074] 34886 
1727204511.92125: sending task result for task 12b410aa-8751-04b9-2e74-000000000074 34886 1727204511.92222: done sending task result for task 12b410aa-8751-04b9-2e74-000000000074 34886 1727204511.92225: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 34886 1727204511.92278: no more pending results, returning what we have 34886 1727204511.92282: results queue empty 34886 1727204511.92283: checking for any_errors_fatal 34886 1727204511.92291: done checking for any_errors_fatal 34886 1727204511.92292: checking for max_fail_percentage 34886 1727204511.92294: done checking for max_fail_percentage 34886 1727204511.92296: checking to see if all hosts have failed and the running result is not ok 34886 1727204511.92297: done checking to see if all hosts have failed 34886 1727204511.92297: getting the remaining hosts for this loop 34886 1727204511.92299: done getting the remaining hosts for this loop 34886 1727204511.92303: getting the next task for host managed-node3 34886 1727204511.92311: done getting next task for host managed-node3 34886 1727204511.92316: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34886 1727204511.92319: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204511.92338: getting variables 34886 1727204511.92340: in VariableManager get_vars() 34886 1727204511.92387: Calling all_inventory to load vars for managed-node3 34886 1727204511.92397: Calling groups_inventory to load vars for managed-node3 34886 1727204511.92401: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204511.92413: Calling all_plugins_play to load vars for managed-node3 34886 1727204511.92416: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204511.92420: Calling groups_plugins_play to load vars for managed-node3 34886 1727204511.93791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204511.95349: done with get_vars() 34886 1727204511.95373: done getting variables 34886 1727204511.95425: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:01:51 -0400 (0:00:00.070) 0:00:30.122 ***** 34886 1727204511.95457: entering _queue_task() for managed-node3/fail 34886 1727204511.95736: worker is 1 (out of 1 available) 34886 1727204511.95751: exiting _queue_task() for managed-node3/fail 34886 1727204511.95764: done queuing things up, now waiting for results queue to drain 34886 1727204511.95766: waiting for pending results... 
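
The YUM counterpart at roles/network/tasks/main.yml:48 mirrors the DNF check but is gated to EL7-era hosts, and on this system the ansible.builtin.yum action is redirected to ansible.builtin.dnf anyway (the "redirecting (type: action)" line above). A sketch of the guard; only the version condition is confirmed by the log, the yum arguments are assumptions:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:                 # redirected to ansible.builtin.dnf here, per the log
        name: "{{ network_packages }}"     # assumption
        state: latest                      # assumption
      when:
        - ansible_distribution_major_version | int < 8   # False for Fedora 39, so the task is skipped
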
34886 1727204511.95968: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34886 1727204511.96078: in run() - task 12b410aa-8751-04b9-2e74-000000000075 34886 1727204511.96093: variable 'ansible_search_path' from source: unknown 34886 1727204511.96097: variable 'ansible_search_path' from source: unknown 34886 1727204511.96136: calling self._execute() 34886 1727204511.96229: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204511.96234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204511.96245: variable 'omit' from source: magic vars 34886 1727204511.96574: variable 'ansible_distribution_major_version' from source: facts 34886 1727204511.96585: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204511.96694: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204511.96870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204511.98604: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204511.98662: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204511.98695: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204511.98735: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204511.98757: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204511.98831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204511.98867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204511.98888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204511.98921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204511.98938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204511.98980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204511.99002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204511.99022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204511.99057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204511.99075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204511.99112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204511.99132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204511.99152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204511.99191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204511.99204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204511.99352: variable 'network_connections' from source: task vars 34886 1727204511.99363: variable 'interface' from source: play vars 34886 1727204511.99426: variable 'interface' from source: play vars 34886 1727204511.99486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204511.99625: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204511.99656: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204511.99682: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204511.99712: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204511.99751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204511.99769: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204511.99791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204511.99815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204511.99859: variable '__network_team_connections_defined' from source: role '' defaults 34886 1727204512.00071: variable 'network_connections' from 
source: task vars 34886 1727204512.00077: variable 'interface' from source: play vars 34886 1727204512.00131: variable 'interface' from source: play vars 34886 1727204512.00157: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34886 1727204512.00161: when evaluation is False, skipping this task 34886 1727204512.00164: _execute() done 34886 1727204512.00168: dumping result to json 34886 1727204512.00170: done dumping result, returning 34886 1727204512.00179: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-04b9-2e74-000000000075] 34886 1727204512.00185: sending task result for task 12b410aa-8751-04b9-2e74-000000000075 34886 1727204512.00285: done sending task result for task 12b410aa-8751-04b9-2e74-000000000075 34886 1727204512.00288: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34886 1727204512.00346: no more pending results, returning what we have 34886 1727204512.00350: results queue empty 34886 1727204512.00351: checking for any_errors_fatal 34886 1727204512.00360: done checking for any_errors_fatal 34886 1727204512.00361: checking for max_fail_percentage 34886 1727204512.00362: done checking for max_fail_percentage 34886 1727204512.00364: checking to see if all hosts have failed and the running result is not ok 34886 1727204512.00365: done checking to see if all hosts have failed 34886 1727204512.00365: getting the remaining hosts for this loop 34886 1727204512.00367: done getting the remaining hosts for this loop 34886 1727204512.00371: getting the next task for host managed-node3 34886 1727204512.00379: done getting next task for host managed-node3 34886 1727204512.00384: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34886 1727204512.00387: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204512.00416: getting variables 34886 1727204512.00418: in VariableManager get_vars() 34886 1727204512.00464: Calling all_inventory to load vars for managed-node3 34886 1727204512.00467: Calling groups_inventory to load vars for managed-node3 34886 1727204512.00470: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204512.00480: Calling all_plugins_play to load vars for managed-node3 34886 1727204512.00484: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204512.00487: Calling groups_plugins_play to load vars for managed-node3 34886 1727204512.01754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204512.03443: done with get_vars() 34886 1727204512.03465: done getting variables 34886 1727204512.03520: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.080) 0:00:30.203 ***** 34886 1727204512.03553: entering _queue_task() for managed-node3/package 34886 1727204512.03817: worker is 1 (out of 1 available) 34886 1727204512.03836: exiting _queue_task() for managed-node3/package 34886 1727204512.03850: done queuing things up, now waiting for results queue to drain 34886 1727204512.03853: waiting for pending results... 34886 1727204512.04041: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 34886 1727204512.04156: in run() - task 12b410aa-8751-04b9-2e74-000000000076 34886 1727204512.04170: variable 'ansible_search_path' from source: unknown 34886 1727204512.04173: variable 'ansible_search_path' from source: unknown 34886 1727204512.04214: calling self._execute() 34886 1727204512.04300: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204512.04305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204512.04316: variable 'omit' from source: magic vars 34886 1727204512.04630: variable 'ansible_distribution_major_version' from source: facts 34886 1727204512.04634: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204512.04804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204512.05026: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204512.05062: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204512.05095: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204512.05147: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204512.05244: variable 'network_packages' from source: role '' defaults 34886 1727204512.05338: variable '__network_provider_setup' from source: role '' defaults 34886 1727204512.05347: variable '__network_service_name_default_nm' from source: role '' defaults 34886 1727204512.05405: variable 
'__network_service_name_default_nm' from source: role '' defaults 34886 1727204512.05413: variable '__network_packages_default_nm' from source: role '' defaults 34886 1727204512.05465: variable '__network_packages_default_nm' from source: role '' defaults 34886 1727204512.05630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204512.07205: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204512.07261: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204512.07291: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204512.07322: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204512.07343: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204512.07417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.07441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.07465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.07501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.07514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.07554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.07575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.07601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.07634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.07646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.07840: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34886 1727204512.07936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.07957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.07977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.08011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.08027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.08101: variable 'ansible_python' from source: facts 34886 1727204512.08131: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34886 1727204512.08196: variable '__network_wpa_supplicant_required' from source: role '' defaults 34886 1727204512.08265: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34886 1727204512.08386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.08408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.08430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.08465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.08478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.08522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.08544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.08568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.08601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.08613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.08734: variable 'network_connections' from source: task vars 34886 1727204512.08741: variable 'interface' from source: play vars 34886 1727204512.08828: variable 'interface' from source: play vars 34886 1727204512.08888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204512.08914: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204512.08941: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.08965: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204512.09009: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204512.09243: variable 'network_connections' from source: task vars 34886 1727204512.09248: variable 'interface' from source: play vars 34886 1727204512.09334: variable 'interface' from source: play vars 34886 1727204512.09364: variable '__network_packages_default_wireless' from source: role '' defaults 34886 1727204512.09432: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204512.09685: variable 'network_connections' from source: task vars 34886 1727204512.09690: variable 'interface' from source: play vars 34886 1727204512.09745: variable 'interface' from source: play vars 34886 1727204512.09768: variable '__network_packages_default_team' from source: role '' defaults 34886 1727204512.09834: variable '__network_team_connections_defined' from source: role '' defaults 34886 1727204512.10089: variable 'network_connections' from source: task vars 34886 1727204512.10093: variable 'interface' from source: play vars 34886 1727204512.10150: variable 'interface' from source: play vars 34886 1727204512.10197: variable '__network_service_name_default_initscripts' from source: role '' defaults 34886 1727204512.10249: variable '__network_service_name_default_initscripts' from source: role '' defaults 34886 1727204512.10256: variable '__network_packages_default_initscripts' from source: role '' defaults 34886 1727204512.10308: variable '__network_packages_default_initscripts' from source: role '' defaults 34886 1727204512.10493: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34886 1727204512.10891: variable 'network_connections' from source: task vars 34886 1727204512.10896: variable 'interface' from source: play vars 34886 1727204512.10948: variable 'interface' from source: play vars 34886 1727204512.10956: variable 'ansible_distribution' from source: facts 34886 1727204512.10959: variable '__network_rh_distros' from source: role '' defaults 34886 1727204512.10968: variable 'ansible_distribution_major_version' from source: facts 34886 1727204512.10982: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34886 1727204512.11124: variable 'ansible_distribution' from source: facts 34886 
1727204512.11128: variable '__network_rh_distros' from source: role '' defaults 34886 1727204512.11131: variable 'ansible_distribution_major_version' from source: facts 34886 1727204512.11139: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34886 1727204512.11279: variable 'ansible_distribution' from source: facts 34886 1727204512.11283: variable '__network_rh_distros' from source: role '' defaults 34886 1727204512.11292: variable 'ansible_distribution_major_version' from source: facts 34886 1727204512.11325: variable 'network_provider' from source: set_fact 34886 1727204512.11337: variable 'ansible_facts' from source: unknown 34886 1727204512.12032: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 34886 1727204512.12036: when evaluation is False, skipping this task 34886 1727204512.12039: _execute() done 34886 1727204512.12045: dumping result to json 34886 1727204512.12047: done dumping result, returning 34886 1727204512.12060: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-04b9-2e74-000000000076] 34886 1727204512.12064: sending task result for task 12b410aa-8751-04b9-2e74-000000000076 34886 1727204512.12165: done sending task result for task 12b410aa-8751-04b9-2e74-000000000076 34886 1727204512.12170: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 34886 1727204512.12232: no more pending results, returning what we have 34886 1727204512.12236: results queue empty 34886 1727204512.12238: checking for any_errors_fatal 34886 1727204512.12246: done checking for any_errors_fatal 34886 1727204512.12247: checking for max_fail_percentage 34886 1727204512.12248: done checking for max_fail_percentage 34886 1727204512.12249: checking to see if all hosts have failed and the running result is not ok 34886 1727204512.12250: done checking to see if all hosts have failed 34886 1727204512.12251: getting the remaining hosts for this loop 34886 1727204512.12253: done getting the remaining hosts for this loop 34886 1727204512.12258: getting the next task for host managed-node3 34886 1727204512.12265: done getting next task for host managed-node3 34886 1727204512.12269: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34886 1727204512.12273: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204512.12301: getting variables 34886 1727204512.12303: in VariableManager get_vars() 34886 1727204512.12354: Calling all_inventory to load vars for managed-node3 34886 1727204512.12357: Calling groups_inventory to load vars for managed-node3 34886 1727204512.12360: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204512.12371: Calling all_plugins_play to load vars for managed-node3 34886 1727204512.12375: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204512.12379: Calling groups_plugins_play to load vars for managed-node3 34886 1727204512.13650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204512.15244: done with get_vars() 34886 1727204512.15273: done getting variables 34886 1727204512.15330: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.118) 0:00:30.321 ***** 34886 1727204512.15361: entering _queue_task() for managed-node3/package 34886 1727204512.15651: worker is 1 (out of 1 available) 34886 1727204512.15666: exiting _queue_task() for managed-node3/package 34886 1727204512.15680: done queuing things up, now waiting for results queue to drain 34886 1727204512.15683: waiting for pending results... 
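The "Install packages" task above (roles/network/tasks/main.yml:73) was skipped because the when clause quoted in the skip result evaluated to False: all of network_packages is already in the gathered package facts. The log only shows the package action plugin and the condition string, so the following is a minimal sketch of what such a role task looks like in Ansible YAML; the argument names beyond the quoted condition are assumptions, not the role's actual body:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # list built from the role defaults resolved above
        state: present
      # condition quoted verbatim in the skip result above
      when: not network_packages is subset(ansible_facts.packages.keys())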
34886 1727204512.15884: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34886 1727204512.16037: in run() - task 12b410aa-8751-04b9-2e74-000000000077 34886 1727204512.16043: variable 'ansible_search_path' from source: unknown 34886 1727204512.16047: variable 'ansible_search_path' from source: unknown 34886 1727204512.16066: calling self._execute() 34886 1727204512.16160: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204512.16164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204512.16179: variable 'omit' from source: magic vars 34886 1727204512.16594: variable 'ansible_distribution_major_version' from source: facts 34886 1727204512.16596: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204512.16639: variable 'network_state' from source: role '' defaults 34886 1727204512.16649: Evaluated conditional (network_state != {}): False 34886 1727204512.16652: when evaluation is False, skipping this task 34886 1727204512.16655: _execute() done 34886 1727204512.16661: dumping result to json 34886 1727204512.16664: done dumping result, returning 34886 1727204512.16673: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-04b9-2e74-000000000077] 34886 1727204512.16680: sending task result for task 12b410aa-8751-04b9-2e74-000000000077 34886 1727204512.16781: done sending task result for task 12b410aa-8751-04b9-2e74-000000000077 34886 1727204512.16784: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34886 1727204512.16836: no more pending results, returning what we have 34886 1727204512.16841: results queue empty 34886 1727204512.16842: checking for any_errors_fatal 34886 1727204512.16848: done checking for any_errors_fatal 34886 1727204512.16849: checking for max_fail_percentage 34886 1727204512.16851: done checking for max_fail_percentage 34886 1727204512.16852: checking to see if all hosts have failed and the running result is not ok 34886 1727204512.16853: done checking to see if all hosts have failed 34886 1727204512.16854: getting the remaining hosts for this loop 34886 1727204512.16855: done getting the remaining hosts for this loop 34886 1727204512.16860: getting the next task for host managed-node3 34886 1727204512.16866: done getting next task for host managed-node3 34886 1727204512.16870: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34886 1727204512.16874: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204512.16898: getting variables 34886 1727204512.16899: in VariableManager get_vars() 34886 1727204512.16943: Calling all_inventory to load vars for managed-node3 34886 1727204512.16946: Calling groups_inventory to load vars for managed-node3 34886 1727204512.16949: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204512.16960: Calling all_plugins_play to load vars for managed-node3 34886 1727204512.16964: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204512.16967: Calling groups_plugins_play to load vars for managed-node3 34886 1727204512.22239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204512.23795: done with get_vars() 34886 1727204512.23818: done getting variables 34886 1727204512.23861: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.085) 0:00:30.406 ***** 34886 1727204512.23886: entering _queue_task() for managed-node3/package 34886 1727204512.24168: worker is 1 (out of 1 available) 34886 1727204512.24183: exiting _queue_task() for managed-node3/package 34886 1727204512.24200: done queuing things up, now waiting for results queue to drain 34886 1727204512.24202: waiting for pending results... 
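The "Install NetworkManager and nmstate when using network_state variable" task (roles/network/tasks/main.yml:85) was likewise skipped: network_state comes from the role defaults and is empty here, so network_state != {} is False. A minimal sketch inferred from the task name, the package action plugin, and the quoted condition; the package list and argument layout are assumptions:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager   # packages implied by the task name, not shown in the log
          - nmstate
        state: present
      when: network_state != {}   # condition quoted in the skip result above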
34886 1727204512.24400: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34886 1727204512.24514: in run() - task 12b410aa-8751-04b9-2e74-000000000078 34886 1727204512.24533: variable 'ansible_search_path' from source: unknown 34886 1727204512.24537: variable 'ansible_search_path' from source: unknown 34886 1727204512.24571: calling self._execute() 34886 1727204512.24663: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204512.24696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204512.24700: variable 'omit' from source: magic vars 34886 1727204512.25022: variable 'ansible_distribution_major_version' from source: facts 34886 1727204512.25030: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204512.25146: variable 'network_state' from source: role '' defaults 34886 1727204512.25156: Evaluated conditional (network_state != {}): False 34886 1727204512.25160: when evaluation is False, skipping this task 34886 1727204512.25165: _execute() done 34886 1727204512.25168: dumping result to json 34886 1727204512.25171: done dumping result, returning 34886 1727204512.25180: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-04b9-2e74-000000000078] 34886 1727204512.25186: sending task result for task 12b410aa-8751-04b9-2e74-000000000078 34886 1727204512.25287: done sending task result for task 12b410aa-8751-04b9-2e74-000000000078 34886 1727204512.25292: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34886 1727204512.25359: no more pending results, returning what we have 34886 1727204512.25363: results queue empty 34886 1727204512.25365: checking for any_errors_fatal 34886 1727204512.25373: done checking for any_errors_fatal 34886 1727204512.25374: checking for max_fail_percentage 34886 1727204512.25376: done checking for max_fail_percentage 34886 1727204512.25377: checking to see if all hosts have failed and the running result is not ok 34886 1727204512.25378: done checking to see if all hosts have failed 34886 1727204512.25379: getting the remaining hosts for this loop 34886 1727204512.25381: done getting the remaining hosts for this loop 34886 1727204512.25385: getting the next task for host managed-node3 34886 1727204512.25394: done getting next task for host managed-node3 34886 1727204512.25398: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34886 1727204512.25401: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204512.25419: getting variables 34886 1727204512.25421: in VariableManager get_vars() 34886 1727204512.25462: Calling all_inventory to load vars for managed-node3 34886 1727204512.25465: Calling groups_inventory to load vars for managed-node3 34886 1727204512.25468: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204512.25477: Calling all_plugins_play to load vars for managed-node3 34886 1727204512.25480: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204512.25484: Calling groups_plugins_play to load vars for managed-node3 34886 1727204512.26671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204512.28342: done with get_vars() 34886 1727204512.28365: done getting variables 34886 1727204512.28416: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.045) 0:00:30.452 ***** 34886 1727204512.28447: entering _queue_task() for managed-node3/service 34886 1727204512.28701: worker is 1 (out of 1 available) 34886 1727204512.28716: exiting _queue_task() for managed-node3/service 34886 1727204512.28731: done queuing things up, now waiting for results queue to drain 34886 1727204512.28733: waiting for pending results... 
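The same pattern applies to "Install python3-libnmstate when using network_state variable" (roles/network/tasks/main.yml:96), skipped for the same reason. A sketch under the same assumptions:

    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate   # package named in the task title
        state: present
      when: network_state != {}   # same condition as the previous task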
34886 1727204512.28926: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34886 1727204512.29041: in run() - task 12b410aa-8751-04b9-2e74-000000000079 34886 1727204512.29055: variable 'ansible_search_path' from source: unknown 34886 1727204512.29059: variable 'ansible_search_path' from source: unknown 34886 1727204512.29096: calling self._execute() 34886 1727204512.29185: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204512.29195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204512.29207: variable 'omit' from source: magic vars 34886 1727204512.29529: variable 'ansible_distribution_major_version' from source: facts 34886 1727204512.29540: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204512.29648: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204512.29819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204512.31564: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204512.31630: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204512.31660: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204512.31695: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204512.31717: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204512.31785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.31815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.31838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.31870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.31882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.31931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.31950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.31973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 34886 1727204512.32006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.32022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.32059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.32079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.32101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.32139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.32151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.32297: variable 'network_connections' from source: task vars 34886 1727204512.32309: variable 'interface' from source: play vars 34886 1727204512.32370: variable 'interface' from source: play vars 34886 1727204512.32433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204512.32570: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204512.32611: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204512.32641: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204512.32666: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204512.32707: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204512.32729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204512.32750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.32772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204512.32819: variable '__network_team_connections_defined' from source: role '' defaults 34886 1727204512.33028: variable 'network_connections' from source: task vars 34886 1727204512.33032: variable 'interface' from source: 
play vars 34886 1727204512.33083: variable 'interface' from source: play vars 34886 1727204512.33120: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34886 1727204512.33124: when evaluation is False, skipping this task 34886 1727204512.33127: _execute() done 34886 1727204512.33130: dumping result to json 34886 1727204512.33132: done dumping result, returning 34886 1727204512.33135: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-04b9-2e74-000000000079] 34886 1727204512.33137: sending task result for task 12b410aa-8751-04b9-2e74-000000000079 34886 1727204512.33233: done sending task result for task 12b410aa-8751-04b9-2e74-000000000079 34886 1727204512.33241: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34886 1727204512.33294: no more pending results, returning what we have 34886 1727204512.33298: results queue empty 34886 1727204512.33299: checking for any_errors_fatal 34886 1727204512.33307: done checking for any_errors_fatal 34886 1727204512.33307: checking for max_fail_percentage 34886 1727204512.33309: done checking for max_fail_percentage 34886 1727204512.33311: checking to see if all hosts have failed and the running result is not ok 34886 1727204512.33312: done checking to see if all hosts have failed 34886 1727204512.33312: getting the remaining hosts for this loop 34886 1727204512.33314: done getting the remaining hosts for this loop 34886 1727204512.33318: getting the next task for host managed-node3 34886 1727204512.33325: done getting next task for host managed-node3 34886 1727204512.33330: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34886 1727204512.33333: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204512.33351: getting variables 34886 1727204512.33353: in VariableManager get_vars() 34886 1727204512.33400: Calling all_inventory to load vars for managed-node3 34886 1727204512.33403: Calling groups_inventory to load vars for managed-node3 34886 1727204512.33406: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204512.33416: Calling all_plugins_play to load vars for managed-node3 34886 1727204512.33419: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204512.33423: Calling groups_plugins_play to load vars for managed-node3 34886 1727204512.34653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204512.36224: done with get_vars() 34886 1727204512.36246: done getting variables 34886 1727204512.36294: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:01:52 -0400 (0:00:00.078) 0:00:30.531 ***** 34886 1727204512.36323: entering _queue_task() for managed-node3/service 34886 1727204512.36563: worker is 1 (out of 1 available) 34886 1727204512.36579: exiting _queue_task() for managed-node3/service 34886 1727204512.36595: done queuing things up, now waiting for results queue to drain 34886 1727204512.36597: waiting for pending results... 34886 1727204512.36787: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34886 1727204512.36910: in run() - task 12b410aa-8751-04b9-2e74-00000000007a 34886 1727204512.36926: variable 'ansible_search_path' from source: unknown 34886 1727204512.36932: variable 'ansible_search_path' from source: unknown 34886 1727204512.36964: calling self._execute() 34886 1727204512.37054: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204512.37058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204512.37070: variable 'omit' from source: magic vars 34886 1727204512.37392: variable 'ansible_distribution_major_version' from source: facts 34886 1727204512.37407: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204512.37550: variable 'network_provider' from source: set_fact 34886 1727204512.37555: variable 'network_state' from source: role '' defaults 34886 1727204512.37567: Evaluated conditional (network_provider == "nm" or network_state != {}): True 34886 1727204512.37574: variable 'omit' from source: magic vars 34886 1727204512.37628: variable 'omit' from source: magic vars 34886 1727204512.37655: variable 'network_service_name' from source: role '' defaults 34886 1727204512.37715: variable 'network_service_name' from source: role '' defaults 34886 1727204512.37808: variable '__network_provider_setup' from source: role '' defaults 34886 1727204512.37813: variable '__network_service_name_default_nm' from source: role '' defaults 34886 1727204512.37872: variable '__network_service_name_default_nm' from source: role '' defaults 34886 1727204512.37880: variable '__network_packages_default_nm' from source: role '' defaults 
34886 1727204512.37938: variable '__network_packages_default_nm' from source: role '' defaults 34886 1727204512.38133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204512.39811: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204512.40147: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204512.40176: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204512.40207: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204512.40236: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204512.40301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.40328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.40353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.40386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.40400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.40446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.40467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.40487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.40520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.40535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.40757: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34886 1727204512.40994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.40998: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.41001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.41035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.41060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.41179: variable 'ansible_python' from source: facts 34886 1727204512.41218: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34886 1727204512.41331: variable '__network_wpa_supplicant_required' from source: role '' defaults 34886 1727204512.41439: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34886 1727204512.41617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.41659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.41700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.41760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.41785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.41896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204512.41907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204512.41943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.42003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204512.42031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204512.42261: variable 'network_connections' from 
source: task vars 34886 1727204512.42264: variable 'interface' from source: play vars 34886 1727204512.42395: variable 'interface' from source: play vars 34886 1727204512.42457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204512.42683: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204512.42767: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204512.42831: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204512.42887: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204512.42978: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204512.43030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204512.43078: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204512.43135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 1727204512.43203: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204512.43697: variable 'network_connections' from source: task vars 34886 1727204512.43701: variable 'interface' from source: play vars 34886 1727204512.43710: variable 'interface' from source: play vars 34886 1727204512.43759: variable '__network_packages_default_wireless' from source: role '' defaults 34886 1727204512.43870: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204512.44267: variable 'network_connections' from source: task vars 34886 1727204512.44278: variable 'interface' from source: play vars 34886 1727204512.44374: variable 'interface' from source: play vars 34886 1727204512.44413: variable '__network_packages_default_team' from source: role '' defaults 34886 1727204512.44526: variable '__network_team_connections_defined' from source: role '' defaults 34886 1727204512.44948: variable 'network_connections' from source: task vars 34886 1727204512.44961: variable 'interface' from source: play vars 34886 1727204512.45059: variable 'interface' from source: play vars 34886 1727204512.45138: variable '__network_service_name_default_initscripts' from source: role '' defaults 34886 1727204512.45226: variable '__network_service_name_default_initscripts' from source: role '' defaults 34886 1727204512.45242: variable '__network_packages_default_initscripts' from source: role '' defaults 34886 1727204512.45396: variable '__network_packages_default_initscripts' from source: role '' defaults 34886 1727204512.45638: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34886 1727204512.46064: variable 'network_connections' from source: task vars 34886 1727204512.46068: variable 'interface' from source: play vars 34886 1727204512.46121: variable 'interface' from source: play vars 34886 
1727204512.46131: variable 'ansible_distribution' from source: facts 34886 1727204512.46135: variable '__network_rh_distros' from source: role '' defaults 34886 1727204512.46142: variable 'ansible_distribution_major_version' from source: facts 34886 1727204512.46157: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34886 1727204512.46304: variable 'ansible_distribution' from source: facts 34886 1727204512.46308: variable '__network_rh_distros' from source: role '' defaults 34886 1727204512.46314: variable 'ansible_distribution_major_version' from source: facts 34886 1727204512.46321: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34886 1727204512.46466: variable 'ansible_distribution' from source: facts 34886 1727204512.46470: variable '__network_rh_distros' from source: role '' defaults 34886 1727204512.46482: variable 'ansible_distribution_major_version' from source: facts 34886 1727204512.46510: variable 'network_provider' from source: set_fact 34886 1727204512.46533: variable 'omit' from source: magic vars 34886 1727204512.46557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204512.46581: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204512.46602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204512.46618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204512.46631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204512.46659: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204512.46662: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204512.46667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204512.46757: Set connection var ansible_timeout to 10 34886 1727204512.46763: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204512.46766: Set connection var ansible_connection to ssh 34886 1727204512.46773: Set connection var ansible_shell_executable to /bin/sh 34886 1727204512.46781: Set connection var ansible_pipelining to False 34886 1727204512.46784: Set connection var ansible_shell_type to sh 34886 1727204512.46812: variable 'ansible_shell_executable' from source: unknown 34886 1727204512.46815: variable 'ansible_connection' from source: unknown 34886 1727204512.46818: variable 'ansible_module_compression' from source: unknown 34886 1727204512.46820: variable 'ansible_shell_type' from source: unknown 34886 1727204512.46828: variable 'ansible_shell_executable' from source: unknown 34886 1727204512.46831: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204512.46836: variable 'ansible_pipelining' from source: unknown 34886 1727204512.46840: variable 'ansible_timeout' from source: unknown 34886 1727204512.46845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204512.46937: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 34886 1727204512.46946: variable 'omit' from source: magic vars 34886 1727204512.46953: starting attempt loop 34886 1727204512.46956: running the handler 34886 1727204512.47022: variable 'ansible_facts' from source: unknown 34886 1727204512.47666: _low_level_execute_command(): starting 34886 1727204512.47670: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204512.48197: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204512.48201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204512.48204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204512.48206: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204512.48209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204512.48265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204512.48268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204512.48322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204512.50111: stdout chunk (state=3): >>>/root <<< 34886 1727204512.50227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204512.50273: stderr chunk (state=3): >>><<< 34886 1727204512.50276: stdout chunk (state=3): >>><<< 34886 1727204512.50297: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204512.50309: _low_level_execute_command(): starting 34886 1727204512.50314: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490 `" && echo ansible-tmp-1727204512.502971-36277-193526580892490="` echo /root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490 `" ) && sleep 0' 34886 1727204512.50772: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204512.50776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204512.50778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204512.50781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204512.50835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204512.50841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204512.50883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204512.52919: stdout chunk (state=3): >>>ansible-tmp-1727204512.502971-36277-193526580892490=/root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490 <<< 34886 1727204512.53040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204512.53085: stderr chunk (state=3): >>><<< 34886 1727204512.53091: stdout chunk (state=3): >>><<< 34886 1727204512.53105: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204512.502971-36277-193526580892490=/root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
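The two commands above show the pattern the run follows before any module executes: "echo ~" resolves the remote home directory, then a single shell invocation creates a private per-task temporary directory whose name combines a timestamp, a process id and a random suffix (ansible-tmp-1727204512.502971-36277-193526580892490). As a rough illustration of that naming scheme only, not Ansible's actual implementation, the following Python sketch builds an equivalent command string; the remote_tmp path and pid arguments are placeholders taken from this log:

    import random
    import time

    def build_remote_tmpdir_command(remote_tmp="/root/.ansible/tmp", pid=36277):
        # Illustrative only: mirrors the ansible-tmp-<epoch>-<pid>-<random>
        # naming visible in the log above, not Ansible's own source code.
        tmpdir = "%s/ansible-tmp-%s-%s-%s" % (
            remote_tmp, time.time(), pid, random.randint(0, 2 ** 48))
        # umask 77 keeps the directory private to the connecting user, and the
        # trailing "sleep 0" matches the pattern used in the logged command.
        return (
            "/bin/sh -c '( umask 77 && mkdir -p \"%s\" && mkdir \"%s\" "
            "&& echo %s ) && sleep 0'" % (remote_tmp, tmpdir, tmpdir)
        )

    print(build_remote_tmpdir_command())

The directory created here is removed again with "rm -f -r" once the module result has been collected, as the entries further below show.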
34886 1727204512.53134: variable 'ansible_module_compression' from source: unknown 34886 1727204512.53181: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 34886 1727204512.53233: variable 'ansible_facts' from source: unknown 34886 1727204512.53377: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490/AnsiballZ_systemd.py 34886 1727204512.53496: Sending initial data 34886 1727204512.53499: Sent initial data (155 bytes) 34886 1727204512.53958: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204512.53961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204512.53963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204512.53971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204512.54030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204512.54033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204512.54063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204512.55877: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204512.55885: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204512.55912: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204512.55953: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpieq6ga4f /root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490/AnsiballZ_systemd.py <<< 34886 1727204512.55958: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490/AnsiballZ_systemd.py" <<< 34886 1727204512.55985: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpieq6ga4f" to remote "/root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490/AnsiballZ_systemd.py" <<< 34886 1727204512.57711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204512.57776: stderr chunk (state=3): >>><<< 34886 1727204512.57779: stdout chunk (state=3): >>><<< 34886 1727204512.57800: done transferring module to remote 34886 1727204512.57813: _low_level_execute_command(): starting 34886 1727204512.57818: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490/ /root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490/AnsiballZ_systemd.py && sleep 0' 34886 1727204512.58283: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204512.58287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204512.58291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204512.58293: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204512.58296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204512.58349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204512.58355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204512.58392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204512.60768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204512.60818: stderr chunk (state=3): >>><<< 34886 1727204512.60821: stdout chunk (state=3): >>><<< 34886 1727204512.60839: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204512.60842: _low_level_execute_command(): starting 34886 1727204512.60848: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490/AnsiballZ_systemd.py && sleep 0' 34886 1727204512.61303: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204512.61307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204512.61309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204512.61312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204512.61314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204512.61364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204512.61367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204512.61415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204512.94896: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", 
"CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11862016", "MemoryAvailable": "infinity", "CPUUsageNSec": "1715598000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "in<<< 34886 1727204512.94944: stdout chunk (state=3): >>>finity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": 
"infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service cloud-init.service network.service shutdown.target multi-user.target", "After": "systemd-journald.socket dbus-broker.service system.slice 
cloud-init-local.service basic.target dbus.socket sysinit.target network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "l<<< 34886 1727204512.94952: stdout chunk (state=3): >>>oaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:14 EDT", "StateChangeTimestampMonotonic": "499035810", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 34886 1727204512.96939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204512.97000: stderr chunk (state=3): >>><<< 34886 1727204512.97009: stdout chunk (state=3): >>><<< 34886 1727204512.97028: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11862016", "MemoryAvailable": "infinity", "CPUUsageNSec": "1715598000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service cloud-init.service network.service shutdown.target multi-user.target", "After": "systemd-journald.socket dbus-broker.service system.slice cloud-init-local.service basic.target dbus.socket sysinit.target network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:14 EDT", "StateChangeTimestampMonotonic": "499035810", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 34886 1727204512.97201: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204512.97221: _low_level_execute_command(): starting 34886 1727204512.97284: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204512.502971-36277-193526580892490/ > /dev/null 2>&1 && sleep 0' 34886 1727204512.97695: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204512.97699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204512.97704: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204512.97716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204512.97775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204512.97784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204512.97821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204512.99732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204512.99780: stderr chunk (state=3): 
>>><<< 34886 1727204512.99783: stdout chunk (state=3): >>><<< 34886 1727204512.99799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204512.99808: handler run complete 34886 1727204512.99862: attempt loop complete, returning result 34886 1727204512.99866: _execute() done 34886 1727204512.99868: dumping result to json 34886 1727204512.99888: done dumping result, returning 34886 1727204512.99899: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-04b9-2e74-00000000007a] 34886 1727204512.99906: sending task result for task 12b410aa-8751-04b9-2e74-00000000007a 34886 1727204513.00178: done sending task result for task 12b410aa-8751-04b9-2e74-00000000007a 34886 1727204513.00181: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34886 1727204513.00248: no more pending results, returning what we have 34886 1727204513.00252: results queue empty 34886 1727204513.00253: checking for any_errors_fatal 34886 1727204513.00259: done checking for any_errors_fatal 34886 1727204513.00260: checking for max_fail_percentage 34886 1727204513.00262: done checking for max_fail_percentage 34886 1727204513.00263: checking to see if all hosts have failed and the running result is not ok 34886 1727204513.00264: done checking to see if all hosts have failed 34886 1727204513.00265: getting the remaining hosts for this loop 34886 1727204513.00266: done getting the remaining hosts for this loop 34886 1727204513.00270: getting the next task for host managed-node3 34886 1727204513.00276: done getting next task for host managed-node3 34886 1727204513.00281: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34886 1727204513.00284: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 34886 1727204513.00305: getting variables 34886 1727204513.00307: in VariableManager get_vars() 34886 1727204513.00350: Calling all_inventory to load vars for managed-node3 34886 1727204513.00354: Calling groups_inventory to load vars for managed-node3 34886 1727204513.00357: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204513.00367: Calling all_plugins_play to load vars for managed-node3 34886 1727204513.00370: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204513.00373: Calling groups_plugins_play to load vars for managed-node3 34886 1727204513.01732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204513.03314: done with get_vars() 34886 1727204513.03341: done getting variables 34886 1727204513.03393: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.670) 0:00:31.202 ***** 34886 1727204513.03424: entering _queue_task() for managed-node3/service 34886 1727204513.03706: worker is 1 (out of 1 available) 34886 1727204513.03726: exiting _queue_task() for managed-node3/service 34886 1727204513.03739: done queuing things up, now waiting for results queue to drain 34886 1727204513.03741: waiting for pending results... 
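The censored result above comes from the systemd module invocation for the "Enable and start NetworkManager" task: the module returns the full property set of NetworkManager.service (ActiveState, UnitFileState, MainPID and so on) in its "status" dictionary, while no_log: true hides that payload in the printed result. The same properties can be read back directly on the managed host; the short Python sketch below is an independent check using "systemctl show", not part of the role or of the module itself:

    import subprocess

    def unit_properties(unit="NetworkManager.service",
                        props=("ActiveState", "SubState", "UnitFileState", "MainPID")):
        # "systemctl show -p" prints Key=Value pairs for the requested
        # properties, the same data that fills the module's "status" dict.
        out = subprocess.run(
            ["systemctl", "show", unit, "-p", ",".join(props)],
            capture_output=True, text=True, check=True,
        ).stdout
        return dict(line.split("=", 1) for line in out.splitlines() if "=" in line)

    print(unit_properties())

Run on the managed node, this should report ActiveState=active and UnitFileState=enabled, matching the values returned to the controller in the module output above.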
34886 1727204513.03952: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34886 1727204513.04072: in run() - task 12b410aa-8751-04b9-2e74-00000000007b 34886 1727204513.04086: variable 'ansible_search_path' from source: unknown 34886 1727204513.04094: variable 'ansible_search_path' from source: unknown 34886 1727204513.04143: calling self._execute() 34886 1727204513.04213: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204513.04222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204513.04233: variable 'omit' from source: magic vars 34886 1727204513.04566: variable 'ansible_distribution_major_version' from source: facts 34886 1727204513.04577: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204513.04688: variable 'network_provider' from source: set_fact 34886 1727204513.04694: Evaluated conditional (network_provider == "nm"): True 34886 1727204513.04774: variable '__network_wpa_supplicant_required' from source: role '' defaults 34886 1727204513.04850: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34886 1727204513.05004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204513.06703: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204513.06758: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204513.06792: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204513.06823: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204513.06849: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204513.06932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204513.06956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204513.06981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204513.07017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204513.07032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204513.07072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204513.07126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 34886 1727204513.07130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204513.07152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204513.07165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204513.07205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204513.07229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204513.07250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204513.07280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204513.07296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204513.07417: variable 'network_connections' from source: task vars 34886 1727204513.07432: variable 'interface' from source: play vars 34886 1727204513.07486: variable 'interface' from source: play vars 34886 1727204513.07554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204513.07686: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204513.07720: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204513.07749: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204513.07781: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204513.07822: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34886 1727204513.07843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34886 1727204513.07867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204513.07888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34886 
1727204513.07932: variable '__network_wireless_connections_defined' from source: role '' defaults 34886 1727204513.08139: variable 'network_connections' from source: task vars 34886 1727204513.08143: variable 'interface' from source: play vars 34886 1727204513.08199: variable 'interface' from source: play vars 34886 1727204513.08227: Evaluated conditional (__network_wpa_supplicant_required): False 34886 1727204513.08230: when evaluation is False, skipping this task 34886 1727204513.08233: _execute() done 34886 1727204513.08236: dumping result to json 34886 1727204513.08241: done dumping result, returning 34886 1727204513.08248: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-04b9-2e74-00000000007b] 34886 1727204513.08259: sending task result for task 12b410aa-8751-04b9-2e74-00000000007b 34886 1727204513.08357: done sending task result for task 12b410aa-8751-04b9-2e74-00000000007b 34886 1727204513.08360: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 34886 1727204513.08415: no more pending results, returning what we have 34886 1727204513.08421: results queue empty 34886 1727204513.08423: checking for any_errors_fatal 34886 1727204513.08450: done checking for any_errors_fatal 34886 1727204513.08451: checking for max_fail_percentage 34886 1727204513.08453: done checking for max_fail_percentage 34886 1727204513.08454: checking to see if all hosts have failed and the running result is not ok 34886 1727204513.08455: done checking to see if all hosts have failed 34886 1727204513.08456: getting the remaining hosts for this loop 34886 1727204513.08458: done getting the remaining hosts for this loop 34886 1727204513.08462: getting the next task for host managed-node3 34886 1727204513.08471: done getting next task for host managed-node3 34886 1727204513.08476: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34886 1727204513.08479: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204513.08499: getting variables 34886 1727204513.08501: in VariableManager get_vars() 34886 1727204513.08548: Calling all_inventory to load vars for managed-node3 34886 1727204513.08551: Calling groups_inventory to load vars for managed-node3 34886 1727204513.08554: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204513.08563: Calling all_plugins_play to load vars for managed-node3 34886 1727204513.08566: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204513.08569: Calling groups_plugins_play to load vars for managed-node3 34886 1727204513.09922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204513.11504: done with get_vars() 34886 1727204513.11530: done getting variables 34886 1727204513.11579: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.081) 0:00:31.283 ***** 34886 1727204513.11607: entering _queue_task() for managed-node3/service 34886 1727204513.11869: worker is 1 (out of 1 available) 34886 1727204513.11885: exiting _queue_task() for managed-node3/service 34886 1727204513.11901: done queuing things up, now waiting for results queue to drain 34886 1727204513.11903: waiting for pending results... 34886 1727204513.12094: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 34886 1727204513.12194: in run() - task 12b410aa-8751-04b9-2e74-00000000007c 34886 1727204513.12210: variable 'ansible_search_path' from source: unknown 34886 1727204513.12214: variable 'ansible_search_path' from source: unknown 34886 1727204513.12252: calling self._execute() 34886 1727204513.12338: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204513.12350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204513.12361: variable 'omit' from source: magic vars 34886 1727204513.12681: variable 'ansible_distribution_major_version' from source: facts 34886 1727204513.12694: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204513.12798: variable 'network_provider' from source: set_fact 34886 1727204513.12802: Evaluated conditional (network_provider == "initscripts"): False 34886 1727204513.12805: when evaluation is False, skipping this task 34886 1727204513.12808: _execute() done 34886 1727204513.12814: dumping result to json 34886 1727204513.12817: done dumping result, returning 34886 1727204513.12826: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-04b9-2e74-00000000007c] 34886 1727204513.12832: sending task result for task 12b410aa-8751-04b9-2e74-00000000007c skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34886 1727204513.12982: no more pending results, returning what we have 34886 1727204513.12987: results queue empty 34886 1727204513.12988: checking for 
any_errors_fatal 34886 1727204513.12999: done checking for any_errors_fatal 34886 1727204513.13000: checking for max_fail_percentage 34886 1727204513.13002: done checking for max_fail_percentage 34886 1727204513.13003: checking to see if all hosts have failed and the running result is not ok 34886 1727204513.13004: done checking to see if all hosts have failed 34886 1727204513.13005: getting the remaining hosts for this loop 34886 1727204513.13007: done getting the remaining hosts for this loop 34886 1727204513.13012: getting the next task for host managed-node3 34886 1727204513.13021: done getting next task for host managed-node3 34886 1727204513.13026: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34886 1727204513.13029: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204513.13047: getting variables 34886 1727204513.13048: in VariableManager get_vars() 34886 1727204513.13087: Calling all_inventory to load vars for managed-node3 34886 1727204513.13098: Calling groups_inventory to load vars for managed-node3 34886 1727204513.13101: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204513.13107: done sending task result for task 12b410aa-8751-04b9-2e74-00000000007c 34886 1727204513.13110: WORKER PROCESS EXITING 34886 1727204513.13121: Calling all_plugins_play to load vars for managed-node3 34886 1727204513.13125: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204513.13129: Calling groups_plugins_play to load vars for managed-node3 34886 1727204513.14306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204513.15894: done with get_vars() 34886 1727204513.15917: done getting variables 34886 1727204513.15967: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.043) 0:00:31.327 ***** 34886 1727204513.15997: entering _queue_task() for managed-node3/copy 34886 1727204513.16238: worker is 1 (out of 1 available) 34886 1727204513.16254: exiting _queue_task() for managed-node3/copy 34886 1727204513.16269: done queuing things up, now waiting for results queue to drain 34886 1727204513.16271: waiting for pending results... 
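
Note on the pattern above: the "Enable network service" skip and the "Ensure initscripts network file dependency is present" task just queued (skipped in the lines that follow) are both gated in roles/network/tasks/main.yml by a when: guard on network_provider, and the service task additionally runs with no_log, which is why its skip result above is printed as "censored". A minimal hedged sketch of that shape; the module arguments here are illustrative assumptions, only the condition, the action types (service, copy) and the no_log flag come from the log:

    - name: Enable network service
      ansible.builtin.service:
        name: network                     # illustrative service name, not shown in the log
        enabled: true
      when: network_provider == "initscripts"
      no_log: true                        # why the skip result is reported as "censored"

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:               # the log loads the 'copy' action for this task
        content: ""                       # illustrative placeholder
        dest: /etc/sysconfig/network      # illustrative path, an assumption
      when: network_provider == "initscripts"
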
34886 1727204513.16454: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34886 1727204513.16558: in run() - task 12b410aa-8751-04b9-2e74-00000000007d 34886 1727204513.16572: variable 'ansible_search_path' from source: unknown 34886 1727204513.16576: variable 'ansible_search_path' from source: unknown 34886 1727204513.16614: calling self._execute() 34886 1727204513.16697: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204513.16705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204513.16717: variable 'omit' from source: magic vars 34886 1727204513.17034: variable 'ansible_distribution_major_version' from source: facts 34886 1727204513.17046: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204513.17149: variable 'network_provider' from source: set_fact 34886 1727204513.17154: Evaluated conditional (network_provider == "initscripts"): False 34886 1727204513.17159: when evaluation is False, skipping this task 34886 1727204513.17162: _execute() done 34886 1727204513.17165: dumping result to json 34886 1727204513.17167: done dumping result, returning 34886 1727204513.17180: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-04b9-2e74-00000000007d] 34886 1727204513.17183: sending task result for task 12b410aa-8751-04b9-2e74-00000000007d 34886 1727204513.17287: done sending task result for task 12b410aa-8751-04b9-2e74-00000000007d 34886 1727204513.17292: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 34886 1727204513.17349: no more pending results, returning what we have 34886 1727204513.17353: results queue empty 34886 1727204513.17354: checking for any_errors_fatal 34886 1727204513.17359: done checking for any_errors_fatal 34886 1727204513.17360: checking for max_fail_percentage 34886 1727204513.17362: done checking for max_fail_percentage 34886 1727204513.17362: checking to see if all hosts have failed and the running result is not ok 34886 1727204513.17363: done checking to see if all hosts have failed 34886 1727204513.17364: getting the remaining hosts for this loop 34886 1727204513.17365: done getting the remaining hosts for this loop 34886 1727204513.17369: getting the next task for host managed-node3 34886 1727204513.17375: done getting next task for host managed-node3 34886 1727204513.17379: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34886 1727204513.17382: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204513.17402: getting variables 34886 1727204513.17404: in VariableManager get_vars() 34886 1727204513.17445: Calling all_inventory to load vars for managed-node3 34886 1727204513.17448: Calling groups_inventory to load vars for managed-node3 34886 1727204513.17451: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204513.17464: Calling all_plugins_play to load vars for managed-node3 34886 1727204513.17468: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204513.17471: Calling groups_plugins_play to load vars for managed-node3 34886 1727204513.18769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204513.20330: done with get_vars() 34886 1727204513.20353: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.044) 0:00:31.372 ***** 34886 1727204513.20428: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 34886 1727204513.20658: worker is 1 (out of 1 available) 34886 1727204513.20672: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 34886 1727204513.20685: done queuing things up, now waiting for results queue to drain 34886 1727204513.20688: waiting for pending results... 34886 1727204513.20896: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34886 1727204513.21005: in run() - task 12b410aa-8751-04b9-2e74-00000000007e 34886 1727204513.21021: variable 'ansible_search_path' from source: unknown 34886 1727204513.21025: variable 'ansible_search_path' from source: unknown 34886 1727204513.21061: calling self._execute() 34886 1727204513.21163: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204513.21170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204513.21181: variable 'omit' from source: magic vars 34886 1727204513.21511: variable 'ansible_distribution_major_version' from source: facts 34886 1727204513.21521: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204513.21531: variable 'omit' from source: magic vars 34886 1727204513.21588: variable 'omit' from source: magic vars 34886 1727204513.21735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204513.23496: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204513.23554: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204513.23593: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204513.23626: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204513.23654: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204513.23730: variable 'network_provider' from source: set_fact 34886 1727204513.23850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 34886 1727204513.23897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204513.23918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204513.23954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204513.23967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204513.24037: variable 'omit' from source: magic vars 34886 1727204513.24139: variable 'omit' from source: magic vars 34886 1727204513.24230: variable 'network_connections' from source: task vars 34886 1727204513.24242: variable 'interface' from source: play vars 34886 1727204513.24295: variable 'interface' from source: play vars 34886 1727204513.24418: variable 'omit' from source: magic vars 34886 1727204513.24426: variable '__lsr_ansible_managed' from source: task vars 34886 1727204513.24479: variable '__lsr_ansible_managed' from source: task vars 34886 1727204513.24729: Loaded config def from plugin (lookup/template) 34886 1727204513.24733: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 34886 1727204513.24765: File lookup term: get_ansible_managed.j2 34886 1727204513.24770: variable 'ansible_search_path' from source: unknown 34886 1727204513.24773: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 34886 1727204513.24786: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 34886 1727204513.24803: variable 'ansible_search_path' from source: unknown 34886 1727204513.30516: variable 'ansible_managed' from source: unknown 34886 1727204513.30664: variable 'omit' from source: magic vars 34886 1727204513.30692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204513.30723: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204513.30738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204513.30755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204513.30766: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204513.30795: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204513.30799: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204513.30803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204513.30888: Set connection var ansible_timeout to 10 34886 1727204513.30894: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204513.30897: Set connection var ansible_connection to ssh 34886 1727204513.30906: Set connection var ansible_shell_executable to /bin/sh 34886 1727204513.30916: Set connection var ansible_pipelining to False 34886 1727204513.30918: Set connection var ansible_shell_type to sh 34886 1727204513.30943: variable 'ansible_shell_executable' from source: unknown 34886 1727204513.30946: variable 'ansible_connection' from source: unknown 34886 1727204513.30949: variable 'ansible_module_compression' from source: unknown 34886 1727204513.30954: variable 'ansible_shell_type' from source: unknown 34886 1727204513.30957: variable 'ansible_shell_executable' from source: unknown 34886 1727204513.30961: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204513.30966: variable 'ansible_pipelining' from source: unknown 34886 1727204513.30970: variable 'ansible_timeout' from source: unknown 34886 1727204513.30975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204513.31093: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34886 1727204513.31105: variable 'omit' from source: magic vars 34886 1727204513.31111: starting attempt loop 34886 1727204513.31114: running the handler 34886 1727204513.31131: _low_level_execute_command(): starting 34886 1727204513.31138: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204513.31661: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204513.31698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204513.31702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204513.31758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204513.31761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204513.31763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204513.31816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204513.33563: stdout chunk (state=3): >>>/root <<< 34886 1727204513.33670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204513.33734: stderr chunk (state=3): >>><<< 34886 1727204513.33739: stdout chunk (state=3): >>><<< 34886 1727204513.33761: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204513.33773: _low_level_execute_command(): starting 34886 1727204513.33780: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823 `" && echo ansible-tmp-1727204513.337622-36295-62791614863823="` echo /root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823 `" ) && sleep 0' 34886 1727204513.34263: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204513.34267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204513.34269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204513.34272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204513.34276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204513.34336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204513.34340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204513.34374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204513.36361: stdout chunk (state=3): >>>ansible-tmp-1727204513.337622-36295-62791614863823=/root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823 <<< 34886 1727204513.36477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204513.36530: stderr chunk (state=3): >>><<< 34886 1727204513.36533: stdout chunk (state=3): >>><<< 34886 1727204513.36551: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204513.337622-36295-62791614863823=/root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204513.36653: variable 'ansible_module_compression' from source: unknown 34886 1727204513.36656: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 34886 1727204513.36664: variable 'ansible_facts' from source: unknown 34886 1727204513.36734: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823/AnsiballZ_network_connections.py 34886 1727204513.36851: Sending initial data 34886 1727204513.36855: Sent initial data (166 bytes) 34886 1727204513.37322: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204513.37326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204513.37329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 
1727204513.37336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204513.37390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204513.37396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204513.37434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204513.39039: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204513.39046: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204513.39077: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204513.39111: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp9h7aetbp /root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823/AnsiballZ_network_connections.py <<< 34886 1727204513.39120: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823/AnsiballZ_network_connections.py" <<< 34886 1727204513.39144: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp9h7aetbp" to remote "/root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823/AnsiballZ_network_connections.py" <<< 34886 1727204513.39153: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823/AnsiballZ_network_connections.py" <<< 34886 1727204513.40245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204513.40310: stderr chunk (state=3): >>><<< 34886 1727204513.40314: stdout chunk (state=3): >>><<< 34886 1727204513.40337: done transferring module to remote 34886 1727204513.40348: _low_level_execute_command(): starting 34886 1727204513.40354: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823/ /root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823/AnsiballZ_network_connections.py && sleep 0' 34886 1727204513.40814: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204513.40817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204513.40822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204513.40825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204513.40877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204513.40881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204513.40920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204513.42749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204513.42801: stderr chunk (state=3): >>><<< 34886 1727204513.42804: stdout chunk (state=3): >>><<< 34886 1727204513.42821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204513.42825: _low_level_execute_command(): starting 34886 1727204513.42827: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823/AnsiballZ_network_connections.py && sleep 0' 34886 1727204513.43281: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204513.43285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204513.43288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204513.43293: stderr chunk (state=3): >>>debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204513.43295: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204513.43348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204513.43351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204513.43397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204513.81425: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 34886 1727204513.81451: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_69psghe5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_69psghe5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 34886 1727204513.81463: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/29a45e80-a1b1-4083-9f57-453b97dfb981: error=unknown <<< 34886 1727204513.81630: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 34886 1727204513.83666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204513.83714: stderr chunk (state=3): >>><<< 34886 1727204513.83719: stdout chunk (state=3): >>><<< 34886 1727204513.83740: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_69psghe5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_69psghe5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/29a45e80-a1b1-4083-9f57-453b97dfb981: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
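
The module arguments echoed in the result above come straight from the role's network_connections input: provider "nm" (resolved earlier in the run via network_provider from set_fact), and a single profile named veth0 with persistent_state: absent and state: down. A hedged sketch of a play that would drive the role this way; the play structure and the literal value of interface are assumptions reconstructed from the variables the log resolves ('interface' from play vars, 'network_connections' from task vars), not the test playbook itself:

    - hosts: managed-node3
      vars:
        interface: veth0                    # the log resolves 'interface' to veth0
        network_connections:
          - name: "{{ interface }}"
            persistent_state: absent        # delete the persistent profile
            state: down                     # and deactivate the runtime connection
      roles:
        - fedora.linux_system_roles.network

With the provider resolving to NetworkManager ("nm"), the role passes this list through to its network_connections module, which is exactly the connection list visible in the _invocation block above.
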
34886 1727204513.83778: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204513.83788: _low_level_execute_command(): starting 34886 1727204513.83796: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204513.337622-36295-62791614863823/ > /dev/null 2>&1 && sleep 0' 34886 1727204513.84285: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204513.84288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204513.84296: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204513.84299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204513.84349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204513.84353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204513.84401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204513.86279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204513.86333: stderr chunk (state=3): >>><<< 34886 1727204513.86336: stdout chunk (state=3): >>><<< 34886 1727204513.86349: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204513.86357: handler run complete 34886 1727204513.86383: attempt loop complete, returning result 34886 1727204513.86386: _execute() done 34886 1727204513.86388: dumping result to json 34886 1727204513.86397: done dumping result, returning 34886 1727204513.86407: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-04b9-2e74-00000000007e] 34886 1727204513.86412: sending task result for task 12b410aa-8751-04b9-2e74-00000000007e 34886 1727204513.86526: done sending task result for task 12b410aa-8751-04b9-2e74-00000000007e 34886 1727204513.86529: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 34886 1727204513.86659: no more pending results, returning what we have 34886 1727204513.86662: results queue empty 34886 1727204513.86663: checking for any_errors_fatal 34886 1727204513.86671: done checking for any_errors_fatal 34886 1727204513.86672: checking for max_fail_percentage 34886 1727204513.86674: done checking for max_fail_percentage 34886 1727204513.86675: checking to see if all hosts have failed and the running result is not ok 34886 1727204513.86676: done checking to see if all hosts have failed 34886 1727204513.86676: getting the remaining hosts for this loop 34886 1727204513.86678: done getting the remaining hosts for this loop 34886 1727204513.86682: getting the next task for host managed-node3 34886 1727204513.86691: done getting next task for host managed-node3 34886 1727204513.86695: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34886 1727204513.86699: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204513.86709: getting variables 34886 1727204513.86711: in VariableManager get_vars() 34886 1727204513.86756: Calling all_inventory to load vars for managed-node3 34886 1727204513.86759: Calling groups_inventory to load vars for managed-node3 34886 1727204513.86761: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204513.86772: Calling all_plugins_play to load vars for managed-node3 34886 1727204513.86775: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204513.86778: Calling groups_plugins_play to load vars for managed-node3 34886 1727204513.88179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204513.89763: done with get_vars() 34886 1727204513.89786: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.694) 0:00:32.066 ***** 34886 1727204513.89867: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 34886 1727204513.90143: worker is 1 (out of 1 available) 34886 1727204513.90157: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 34886 1727204513.90172: done queuing things up, now waiting for results queue to drain 34886 1727204513.90174: waiting for pending results... 34886 1727204513.90376: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 34886 1727204513.90488: in run() - task 12b410aa-8751-04b9-2e74-00000000007f 34886 1727204513.90505: variable 'ansible_search_path' from source: unknown 34886 1727204513.90510: variable 'ansible_search_path' from source: unknown 34886 1727204513.90545: calling self._execute() 34886 1727204513.90639: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204513.90645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204513.90657: variable 'omit' from source: magic vars 34886 1727204513.90986: variable 'ansible_distribution_major_version' from source: facts 34886 1727204513.90998: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204513.91104: variable 'network_state' from source: role '' defaults 34886 1727204513.91114: Evaluated conditional (network_state != {}): False 34886 1727204513.91118: when evaluation is False, skipping this task 34886 1727204513.91124: _execute() done 34886 1727204513.91127: dumping result to json 34886 1727204513.91130: done dumping result, returning 34886 1727204513.91138: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-04b9-2e74-00000000007f] 34886 1727204513.91143: sending task result for task 12b410aa-8751-04b9-2e74-00000000007f 34886 1727204513.91248: done sending task result for task 12b410aa-8751-04b9-2e74-00000000007f 34886 1727204513.91252: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34886 1727204513.91337: no more pending results, returning what we have 34886 1727204513.91341: results queue empty 34886 1727204513.91342: checking for any_errors_fatal 34886 1727204513.91350: done checking for any_errors_fatal 34886 1727204513.91351: checking for max_fail_percentage 34886 
1727204513.91352: done checking for max_fail_percentage 34886 1727204513.91353: checking to see if all hosts have failed and the running result is not ok 34886 1727204513.91354: done checking to see if all hosts have failed 34886 1727204513.91355: getting the remaining hosts for this loop 34886 1727204513.91356: done getting the remaining hosts for this loop 34886 1727204513.91360: getting the next task for host managed-node3 34886 1727204513.91368: done getting next task for host managed-node3 34886 1727204513.91373: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34886 1727204513.91376: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204513.91396: getting variables 34886 1727204513.91399: in VariableManager get_vars() 34886 1727204513.91442: Calling all_inventory to load vars for managed-node3 34886 1727204513.91445: Calling groups_inventory to load vars for managed-node3 34886 1727204513.91447: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204513.91457: Calling all_plugins_play to load vars for managed-node3 34886 1727204513.91460: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204513.91464: Calling groups_plugins_play to load vars for managed-node3 34886 1727204513.92654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204513.94252: done with get_vars() 34886 1727204513.94275: done getting variables 34886 1727204513.94331: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.044) 0:00:32.111 ***** 34886 1727204513.94361: entering _queue_task() for managed-node3/debug 34886 1727204513.94631: worker is 1 (out of 1 available) 34886 1727204513.94646: exiting _queue_task() for managed-node3/debug 34886 1727204513.94659: done queuing things up, now waiting for results queue to drain 34886 1727204513.94661: waiting for pending results... 
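
The "Configure networking state" skip above reflects the role's profile/state split: that task only runs when the network_state role variable is non-empty (network_state != {}), and in this run it stays at its empty default. A hedged illustration of input that would flip the conditional to True and route configuration through the network_state module instead; the interface values and the nmstate-style schema are assumptions for illustration, not taken from this run:

    vars:
      network_state:              # non-empty, so "network_state != {}" evaluates True
        interfaces:
          - name: veth0           # illustrative interface
            type: ethernet
            state: up
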
34886 1727204513.94855: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34886 1727204513.94965: in run() - task 12b410aa-8751-04b9-2e74-000000000080 34886 1727204513.94980: variable 'ansible_search_path' from source: unknown 34886 1727204513.94984: variable 'ansible_search_path' from source: unknown 34886 1727204513.95022: calling self._execute() 34886 1727204513.95110: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204513.95116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204513.95127: variable 'omit' from source: magic vars 34886 1727204513.95447: variable 'ansible_distribution_major_version' from source: facts 34886 1727204513.95459: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204513.95465: variable 'omit' from source: magic vars 34886 1727204513.95515: variable 'omit' from source: magic vars 34886 1727204513.95549: variable 'omit' from source: magic vars 34886 1727204513.95587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204513.95622: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204513.95638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204513.95656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204513.95668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204513.95698: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204513.95702: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204513.95705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204513.95793: Set connection var ansible_timeout to 10 34886 1727204513.95801: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204513.95804: Set connection var ansible_connection to ssh 34886 1727204513.95811: Set connection var ansible_shell_executable to /bin/sh 34886 1727204513.95822: Set connection var ansible_pipelining to False 34886 1727204513.95825: Set connection var ansible_shell_type to sh 34886 1727204513.95845: variable 'ansible_shell_executable' from source: unknown 34886 1727204513.95848: variable 'ansible_connection' from source: unknown 34886 1727204513.95851: variable 'ansible_module_compression' from source: unknown 34886 1727204513.95856: variable 'ansible_shell_type' from source: unknown 34886 1727204513.95858: variable 'ansible_shell_executable' from source: unknown 34886 1727204513.95863: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204513.95867: variable 'ansible_pipelining' from source: unknown 34886 1727204513.95875: variable 'ansible_timeout' from source: unknown 34886 1727204513.95878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204513.96003: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 
1727204513.96015: variable 'omit' from source: magic vars 34886 1727204513.96023: starting attempt loop 34886 1727204513.96027: running the handler 34886 1727204513.96130: variable '__network_connections_result' from source: set_fact 34886 1727204513.96174: handler run complete 34886 1727204513.96192: attempt loop complete, returning result 34886 1727204513.96195: _execute() done 34886 1727204513.96198: dumping result to json 34886 1727204513.96208: done dumping result, returning 34886 1727204513.96212: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-04b9-2e74-000000000080] 34886 1727204513.96221: sending task result for task 12b410aa-8751-04b9-2e74-000000000080 34886 1727204513.96314: done sending task result for task 12b410aa-8751-04b9-2e74-000000000080 34886 1727204513.96317: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 34886 1727204513.96395: no more pending results, returning what we have 34886 1727204513.96399: results queue empty 34886 1727204513.96401: checking for any_errors_fatal 34886 1727204513.96408: done checking for any_errors_fatal 34886 1727204513.96409: checking for max_fail_percentage 34886 1727204513.96411: done checking for max_fail_percentage 34886 1727204513.96412: checking to see if all hosts have failed and the running result is not ok 34886 1727204513.96413: done checking to see if all hosts have failed 34886 1727204513.96414: getting the remaining hosts for this loop 34886 1727204513.96415: done getting the remaining hosts for this loop 34886 1727204513.96422: getting the next task for host managed-node3 34886 1727204513.96430: done getting next task for host managed-node3 34886 1727204513.96434: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34886 1727204513.96437: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204513.96448: getting variables 34886 1727204513.96449: in VariableManager get_vars() 34886 1727204513.96485: Calling all_inventory to load vars for managed-node3 34886 1727204513.96488: Calling groups_inventory to load vars for managed-node3 34886 1727204513.96499: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204513.96508: Calling all_plugins_play to load vars for managed-node3 34886 1727204513.96511: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204513.96514: Calling groups_plugins_play to load vars for managed-node3 34886 1727204513.97851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204513.99418: done with get_vars() 34886 1727204513.99446: done getting variables 34886 1727204513.99495: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:01:53 -0400 (0:00:00.051) 0:00:32.163 ***** 34886 1727204513.99524: entering _queue_task() for managed-node3/debug 34886 1727204513.99777: worker is 1 (out of 1 available) 34886 1727204513.99794: exiting _queue_task() for managed-node3/debug 34886 1727204513.99808: done queuing things up, now waiting for results queue to drain 34886 1727204513.99810: waiting for pending results... 34886 1727204514.00008: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34886 1727204514.00114: in run() - task 12b410aa-8751-04b9-2e74-000000000081 34886 1727204514.00135: variable 'ansible_search_path' from source: unknown 34886 1727204514.00141: variable 'ansible_search_path' from source: unknown 34886 1727204514.00172: calling self._execute() 34886 1727204514.00263: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204514.00271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204514.00280: variable 'omit' from source: magic vars 34886 1727204514.00601: variable 'ansible_distribution_major_version' from source: facts 34886 1727204514.00612: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204514.00621: variable 'omit' from source: magic vars 34886 1727204514.00677: variable 'omit' from source: magic vars 34886 1727204514.00710: variable 'omit' from source: magic vars 34886 1727204514.00747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204514.00781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204514.00802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204514.00818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204514.00830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204514.00858: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204514.00862: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204514.00865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204514.00954: Set connection var ansible_timeout to 10 34886 1727204514.00960: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204514.00963: Set connection var ansible_connection to ssh 34886 1727204514.00970: Set connection var ansible_shell_executable to /bin/sh 34886 1727204514.00978: Set connection var ansible_pipelining to False 34886 1727204514.00981: Set connection var ansible_shell_type to sh 34886 1727204514.01009: variable 'ansible_shell_executable' from source: unknown 34886 1727204514.01013: variable 'ansible_connection' from source: unknown 34886 1727204514.01017: variable 'ansible_module_compression' from source: unknown 34886 1727204514.01022: variable 'ansible_shell_type' from source: unknown 34886 1727204514.01024: variable 'ansible_shell_executable' from source: unknown 34886 1727204514.01027: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204514.01029: variable 'ansible_pipelining' from source: unknown 34886 1727204514.01034: variable 'ansible_timeout' from source: unknown 34886 1727204514.01040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204514.01165: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204514.01177: variable 'omit' from source: magic vars 34886 1727204514.01183: starting attempt loop 34886 1727204514.01186: running the handler 34886 1727204514.01234: variable '__network_connections_result' from source: set_fact 34886 1727204514.01300: variable '__network_connections_result' from source: set_fact 34886 1727204514.01393: handler run complete 34886 1727204514.01415: attempt loop complete, returning result 34886 1727204514.01421: _execute() done 34886 1727204514.01424: dumping result to json 34886 1727204514.01427: done dumping result, returning 34886 1727204514.01436: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-04b9-2e74-000000000081] 34886 1727204514.01447: sending task result for task 12b410aa-8751-04b9-2e74-000000000081 34886 1727204514.01542: done sending task result for task 12b410aa-8751-04b9-2e74-000000000081 34886 1727204514.01545: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 34886 1727204514.01670: no more pending results, returning what we have 34886 1727204514.01674: results queue empty 34886 1727204514.01675: checking for any_errors_fatal 34886 1727204514.01681: done checking for any_errors_fatal 34886 1727204514.01681: checking for max_fail_percentage 34886 1727204514.01683: done checking for max_fail_percentage 34886 1727204514.01684: 
checking to see if all hosts have failed and the running result is not ok 34886 1727204514.01685: done checking to see if all hosts have failed 34886 1727204514.01686: getting the remaining hosts for this loop 34886 1727204514.01688: done getting the remaining hosts for this loop 34886 1727204514.01693: getting the next task for host managed-node3 34886 1727204514.01699: done getting next task for host managed-node3 34886 1727204514.01703: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34886 1727204514.01706: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204514.01717: getting variables 34886 1727204514.01718: in VariableManager get_vars() 34886 1727204514.01760: Calling all_inventory to load vars for managed-node3 34886 1727204514.01764: Calling groups_inventory to load vars for managed-node3 34886 1727204514.01766: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204514.01780: Calling all_plugins_play to load vars for managed-node3 34886 1727204514.01784: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204514.01787: Calling groups_plugins_play to load vars for managed-node3 34886 1727204514.03493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204514.05225: done with get_vars() 34886 1727204514.05254: done getting variables 34886 1727204514.05307: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:01:54 -0400 (0:00:00.058) 0:00:32.221 ***** 34886 1727204514.05343: entering _queue_task() for managed-node3/debug 34886 1727204514.05617: worker is 1 (out of 1 available) 34886 1727204514.05636: exiting _queue_task() for managed-node3/debug 34886 1727204514.05650: done queuing things up, now waiting for results queue to drain 34886 1727204514.05652: waiting for pending results... 
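(The two tasks queued at this point are the role's post-apply reporting steps at tasks/main.yml:181 and tasks/main.yml:186. A minimal sketch of what they likely look like, reconstructed only from the variable names and conditional visible in this log rather than from the role source; the second task's variable name is an assumption:

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result

- name: Show debug messages for the network_state
  debug:
    var: __network_state_result  # assumed name; only the when-condition appears in the log
  when: network_state != {}

Because network_state is left at its empty role default in this run, the second task is skipped just below with false_condition "network_state != {}".)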
34886 1727204514.05941: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34886 1727204514.06123: in run() - task 12b410aa-8751-04b9-2e74-000000000082 34886 1727204514.06127: variable 'ansible_search_path' from source: unknown 34886 1727204514.06131: variable 'ansible_search_path' from source: unknown 34886 1727204514.06133: calling self._execute() 34886 1727204514.06183: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204514.06411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204514.06415: variable 'omit' from source: magic vars 34886 1727204514.06799: variable 'ansible_distribution_major_version' from source: facts 34886 1727204514.06802: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204514.06805: variable 'network_state' from source: role '' defaults 34886 1727204514.06808: Evaluated conditional (network_state != {}): False 34886 1727204514.06811: when evaluation is False, skipping this task 34886 1727204514.06813: _execute() done 34886 1727204514.06815: dumping result to json 34886 1727204514.06818: done dumping result, returning 34886 1727204514.06823: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-04b9-2e74-000000000082] 34886 1727204514.06825: sending task result for task 12b410aa-8751-04b9-2e74-000000000082 34886 1727204514.06925: done sending task result for task 12b410aa-8751-04b9-2e74-000000000082 34886 1727204514.06928: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 34886 1727204514.06978: no more pending results, returning what we have 34886 1727204514.06983: results queue empty 34886 1727204514.06984: checking for any_errors_fatal 34886 1727204514.06996: done checking for any_errors_fatal 34886 1727204514.06997: checking for max_fail_percentage 34886 1727204514.06998: done checking for max_fail_percentage 34886 1727204514.07000: checking to see if all hosts have failed and the running result is not ok 34886 1727204514.07001: done checking to see if all hosts have failed 34886 1727204514.07001: getting the remaining hosts for this loop 34886 1727204514.07003: done getting the remaining hosts for this loop 34886 1727204514.07008: getting the next task for host managed-node3 34886 1727204514.07018: done getting next task for host managed-node3 34886 1727204514.07025: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34886 1727204514.07028: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204514.07046: getting variables 34886 1727204514.07047: in VariableManager get_vars() 34886 1727204514.07086: Calling all_inventory to load vars for managed-node3 34886 1727204514.07196: Calling groups_inventory to load vars for managed-node3 34886 1727204514.07200: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204514.07210: Calling all_plugins_play to load vars for managed-node3 34886 1727204514.07214: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204514.07217: Calling groups_plugins_play to load vars for managed-node3 34886 1727204514.09458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204514.12639: done with get_vars() 34886 1727204514.12682: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:01:54 -0400 (0:00:00.074) 0:00:32.295 ***** 34886 1727204514.12824: entering _queue_task() for managed-node3/ping 34886 1727204514.13230: worker is 1 (out of 1 available) 34886 1727204514.13244: exiting _queue_task() for managed-node3/ping 34886 1727204514.13265: done queuing things up, now waiting for results queue to drain 34886 1727204514.13267: waiting for pending results... 34886 1727204514.13615: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 34886 1727204514.13796: in run() - task 12b410aa-8751-04b9-2e74-000000000083 34886 1727204514.13837: variable 'ansible_search_path' from source: unknown 34886 1727204514.13923: variable 'ansible_search_path' from source: unknown 34886 1727204514.13928: calling self._execute() 34886 1727204514.14078: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204514.14095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204514.14113: variable 'omit' from source: magic vars 34886 1727204514.14650: variable 'ansible_distribution_major_version' from source: facts 34886 1727204514.14670: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204514.14694: variable 'omit' from source: magic vars 34886 1727204514.14794: variable 'omit' from source: magic vars 34886 1727204514.14858: variable 'omit' from source: magic vars 34886 1727204514.14935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204514.15017: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204514.15028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204514.15048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204514.15068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204514.15112: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204514.15131: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204514.15151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204514.15345: Set connection var ansible_timeout to 10 34886 1727204514.15353: Set connection var 
ansible_module_compression to ZIP_DEFLATED 34886 1727204514.15356: Set connection var ansible_connection to ssh 34886 1727204514.15359: Set connection var ansible_shell_executable to /bin/sh 34886 1727204514.15363: Set connection var ansible_pipelining to False 34886 1727204514.15365: Set connection var ansible_shell_type to sh 34886 1727204514.15400: variable 'ansible_shell_executable' from source: unknown 34886 1727204514.15411: variable 'ansible_connection' from source: unknown 34886 1727204514.15423: variable 'ansible_module_compression' from source: unknown 34886 1727204514.15453: variable 'ansible_shell_type' from source: unknown 34886 1727204514.15457: variable 'ansible_shell_executable' from source: unknown 34886 1727204514.15464: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204514.15467: variable 'ansible_pipelining' from source: unknown 34886 1727204514.15469: variable 'ansible_timeout' from source: unknown 34886 1727204514.15563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204514.15767: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34886 1727204514.15800: variable 'omit' from source: magic vars 34886 1727204514.15816: starting attempt loop 34886 1727204514.15828: running the handler 34886 1727204514.15849: _low_level_execute_command(): starting 34886 1727204514.15862: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204514.16709: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204514.16779: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204514.16839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204514.16859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204514.16902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204514.16991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204514.18750: stdout chunk (state=3): >>>/root <<< 34886 1727204514.18972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204514.18975: stdout chunk (state=3): >>><<< 34886 1727204514.18978: stderr chunk (state=3): >>><<< 34886 1727204514.19002: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204514.19026: _low_level_execute_command(): starting 34886 1727204514.19131: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034 `" && echo ansible-tmp-1727204514.1900969-36315-186405463717034="` echo /root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034 `" ) && sleep 0' 34886 1727204514.19983: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204514.19987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204514.19991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204514.20001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204514.20004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204514.20046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204514.20070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204514.20142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204514.22152: stdout chunk (state=3): >>>ansible-tmp-1727204514.1900969-36315-186405463717034=/root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034 <<< 34886 1727204514.22346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204514.22365: stderr chunk (state=3): >>><<< 34886 1727204514.22374: stdout chunk (state=3): >>><<< 34886 1727204514.22404: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204514.1900969-36315-186405463717034=/root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204514.22469: variable 'ansible_module_compression' from source: unknown 34886 1727204514.22520: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 34886 1727204514.22575: variable 'ansible_facts' from source: unknown 34886 1727204514.22695: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034/AnsiballZ_ping.py 34886 1727204514.22945: Sending initial data 34886 1727204514.22948: Sent initial data (153 bytes) 34886 1727204514.23537: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34886 1727204514.23608: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204514.23654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204514.23671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204514.23696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204514.23769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204514.25704: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204514.25712: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204514.25720: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmptrbkskg9 /root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034/AnsiballZ_ping.py <<< 34886 1727204514.25724: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034/AnsiballZ_ping.py" <<< 34886 1727204514.25726: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmptrbkskg9" to remote "/root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034/AnsiballZ_ping.py" <<< 34886 1727204514.26477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204514.26595: stderr chunk (state=3): >>><<< 34886 1727204514.26610: stdout chunk (state=3): >>><<< 34886 1727204514.26640: done transferring module to remote 34886 1727204514.26657: _low_level_execute_command(): starting 34886 1727204514.26666: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034/ /root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034/AnsiballZ_ping.py && sleep 0' 34886 1727204514.27300: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204514.27317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204514.27335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204514.27370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204514.27475: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 34886 1727204514.27498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204514.27521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204514.27593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 
1727204514.29527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204514.29575: stdout chunk (state=3): >>><<< 34886 1727204514.29578: stderr chunk (state=3): >>><<< 34886 1727204514.29599: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204514.29609: _low_level_execute_command(): starting 34886 1727204514.29703: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034/AnsiballZ_ping.py && sleep 0' 34886 1727204514.30359: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204514.30375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204514.30420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204514.30452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204514.30544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204514.30578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204514.30669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204514.47981: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 34886 1727204514.49744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204514.49750: stdout chunk (state=3): >>><<< 34886 1727204514.49752: stderr chunk (state=3): >>><<< 34886 1727204514.49755: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 34886 1727204514.49758: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204514.49761: _low_level_execute_command(): starting 34886 1727204514.49763: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204514.1900969-36315-186405463717034/ > /dev/null 2>&1 && sleep 0' 34886 1727204514.51241: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204514.51250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204514.51261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204514.51281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204514.51296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204514.51428: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204514.51607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204514.51707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204514.53652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204514.53827: stderr chunk (state=3): >>><<< 34886 1727204514.53831: stdout chunk (state=3): >>><<< 34886 1727204514.53852: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204514.53861: handler run complete 34886 1727204514.53897: attempt loop complete, returning result 34886 1727204514.53900: _execute() done 34886 1727204514.53903: dumping result to json 34886 1727204514.53905: done dumping result, returning 34886 1727204514.53908: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-04b9-2e74-000000000083] 34886 1727204514.53910: sending task result for task 12b410aa-8751-04b9-2e74-000000000083 34886 1727204514.54071: done sending task result for task 12b410aa-8751-04b9-2e74-000000000083 34886 1727204514.54074: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 34886 1727204514.54174: no more pending results, returning what we have 34886 1727204514.54178: results queue empty 34886 1727204514.54179: checking for any_errors_fatal 34886 1727204514.54186: done checking for any_errors_fatal 34886 1727204514.54187: checking for max_fail_percentage 34886 1727204514.54191: done checking for max_fail_percentage 34886 1727204514.54192: checking to see if all hosts have failed and the running result is not ok 34886 1727204514.54193: done checking to see if all hosts have failed 34886 1727204514.54194: getting the remaining hosts for this loop 34886 1727204514.54196: done getting the remaining hosts for this loop 34886 1727204514.54201: getting the next task for host managed-node3 34886 1727204514.54213: done getting next task for host managed-node3 34886 1727204514.54216: ^ task is: TASK: meta (role_complete) 34886 1727204514.54220: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204514.54234: getting variables 34886 1727204514.54236: in VariableManager get_vars() 34886 1727204514.54286: Calling all_inventory to load vars for managed-node3 34886 1727204514.54509: Calling groups_inventory to load vars for managed-node3 34886 1727204514.54514: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204514.54524: Calling all_plugins_play to load vars for managed-node3 34886 1727204514.54528: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204514.54531: Calling groups_plugins_play to load vars for managed-node3 34886 1727204514.57022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204514.60326: done with get_vars() 34886 1727204514.60373: done getting variables 34886 1727204514.60491: done queuing things up, now waiting for results queue to drain 34886 1727204514.60494: results queue empty 34886 1727204514.60495: checking for any_errors_fatal 34886 1727204514.60499: done checking for any_errors_fatal 34886 1727204514.60500: checking for max_fail_percentage 34886 1727204514.60502: done checking for max_fail_percentage 34886 1727204514.60503: checking to see if all hosts have failed and the running result is not ok 34886 1727204514.60504: done checking to see if all hosts have failed 34886 1727204514.60505: getting the remaining hosts for this loop 34886 1727204514.60506: done getting the remaining hosts for this loop 34886 1727204514.60510: getting the next task for host managed-node3 34886 1727204514.60515: done getting next task for host managed-node3 34886 1727204514.60519: ^ task is: TASK: Include the task 'manage_test_interface.yml' 34886 1727204514.60521: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204514.60524: getting variables 34886 1727204514.60525: in VariableManager get_vars() 34886 1727204514.60553: Calling all_inventory to load vars for managed-node3 34886 1727204514.60556: Calling groups_inventory to load vars for managed-node3 34886 1727204514.60559: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204514.60566: Calling all_plugins_play to load vars for managed-node3 34886 1727204514.60569: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204514.60573: Calling groups_plugins_play to load vars for managed-node3 34886 1727204514.62818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204514.66051: done with get_vars() 34886 1727204514.66107: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:104 Tuesday 24 September 2024 15:01:54 -0400 (0:00:00.533) 0:00:32.829 ***** 34886 1727204514.66221: entering _queue_task() for managed-node3/include_tasks 34886 1727204514.66686: worker is 1 (out of 1 available) 34886 1727204514.66706: exiting _queue_task() for managed-node3/include_tasks 34886 1727204514.66722: done queuing things up, now waiting for results queue to drain 34886 1727204514.66724: waiting for pending results... 34886 1727204514.67211: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 34886 1727204514.67242: in run() - task 12b410aa-8751-04b9-2e74-0000000000b3 34886 1727204514.67259: variable 'ansible_search_path' from source: unknown 34886 1727204514.67307: calling self._execute() 34886 1727204514.67427: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204514.67595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204514.67600: variable 'omit' from source: magic vars 34886 1727204514.67947: variable 'ansible_distribution_major_version' from source: facts 34886 1727204514.67962: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204514.67969: _execute() done 34886 1727204514.67973: dumping result to json 34886 1727204514.67978: done dumping result, returning 34886 1727204514.67985: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [12b410aa-8751-04b9-2e74-0000000000b3] 34886 1727204514.68007: sending task result for task 12b410aa-8751-04b9-2e74-0000000000b3 34886 1727204514.68125: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000b3 34886 1727204514.68129: WORKER PROCESS EXITING 34886 1727204514.68161: no more pending results, returning what we have 34886 1727204514.68169: in VariableManager get_vars() 34886 1727204514.68346: Calling all_inventory to load vars for managed-node3 34886 1727204514.68351: Calling groups_inventory to load vars for managed-node3 34886 1727204514.68354: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204514.68369: Calling all_plugins_play to load vars for managed-node3 34886 1727204514.68373: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204514.68376: Calling groups_plugins_play to load vars for managed-node3 34886 1727204514.70708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204514.74020: done with get_vars() 34886 
1727204514.74064: variable 'ansible_search_path' from source: unknown 34886 1727204514.74083: we have included files to process 34886 1727204514.74084: generating all_blocks data 34886 1727204514.74087: done generating all_blocks data 34886 1727204514.74095: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34886 1727204514.74097: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34886 1727204514.74104: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34886 1727204514.74733: in VariableManager get_vars() 34886 1727204514.74777: done with get_vars() 34886 1727204514.75681: done processing included file 34886 1727204514.75683: iterating over new_blocks loaded from include file 34886 1727204514.75685: in VariableManager get_vars() 34886 1727204514.75716: done with get_vars() 34886 1727204514.75718: filtering new block on tags 34886 1727204514.75767: done filtering new block on tags 34886 1727204514.75771: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 34886 1727204514.75777: extending task lists for all hosts with included blocks 34886 1727204514.79172: done extending task lists 34886 1727204514.79174: done processing included files 34886 1727204514.79175: results queue empty 34886 1727204514.79176: checking for any_errors_fatal 34886 1727204514.79178: done checking for any_errors_fatal 34886 1727204514.79179: checking for max_fail_percentage 34886 1727204514.79180: done checking for max_fail_percentage 34886 1727204514.79181: checking to see if all hosts have failed and the running result is not ok 34886 1727204514.79183: done checking to see if all hosts have failed 34886 1727204514.79183: getting the remaining hosts for this loop 34886 1727204514.79185: done getting the remaining hosts for this loop 34886 1727204514.79188: getting the next task for host managed-node3 34886 1727204514.79194: done getting next task for host managed-node3 34886 1727204514.79197: ^ task is: TASK: Ensure state in ["present", "absent"] 34886 1727204514.79206: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204514.79210: getting variables 34886 1727204514.79211: in VariableManager get_vars() 34886 1727204514.79232: Calling all_inventory to load vars for managed-node3 34886 1727204514.79235: Calling groups_inventory to load vars for managed-node3 34886 1727204514.79238: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204514.79246: Calling all_plugins_play to load vars for managed-node3 34886 1727204514.79249: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204514.79253: Calling groups_plugins_play to load vars for managed-node3 34886 1727204514.81546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204514.84669: done with get_vars() 34886 1727204514.84702: done getting variables 34886 1727204514.84752: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:01:54 -0400 (0:00:00.185) 0:00:33.015 ***** 34886 1727204514.84779: entering _queue_task() for managed-node3/fail 34886 1727204514.85095: worker is 1 (out of 1 available) 34886 1727204514.85111: exiting _queue_task() for managed-node3/fail 34886 1727204514.85127: done queuing things up, now waiting for results queue to drain 34886 1727204514.85129: waiting for pending results... 34886 1727204514.85328: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 34886 1727204514.85422: in run() - task 12b410aa-8751-04b9-2e74-0000000005cc 34886 1727204514.85437: variable 'ansible_search_path' from source: unknown 34886 1727204514.85441: variable 'ansible_search_path' from source: unknown 34886 1727204514.85477: calling self._execute() 34886 1727204514.85562: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204514.85570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204514.85583: variable 'omit' from source: magic vars 34886 1727204514.85912: variable 'ansible_distribution_major_version' from source: facts 34886 1727204514.85919: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204514.86038: variable 'state' from source: include params 34886 1727204514.86044: Evaluated conditional (state not in ["present", "absent"]): False 34886 1727204514.86047: when evaluation is False, skipping this task 34886 1727204514.86050: _execute() done 34886 1727204514.86056: dumping result to json 34886 1727204514.86059: done dumping result, returning 34886 1727204514.86066: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [12b410aa-8751-04b9-2e74-0000000005cc] 34886 1727204514.86073: sending task result for task 12b410aa-8751-04b9-2e74-0000000005cc 34886 1727204514.86167: done sending task result for task 12b410aa-8751-04b9-2e74-0000000005cc 34886 1727204514.86170: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 34886 1727204514.86226: no more pending 
results, returning what we have 34886 1727204514.86232: results queue empty 34886 1727204514.86233: checking for any_errors_fatal 34886 1727204514.86236: done checking for any_errors_fatal 34886 1727204514.86237: checking for max_fail_percentage 34886 1727204514.86238: done checking for max_fail_percentage 34886 1727204514.86239: checking to see if all hosts have failed and the running result is not ok 34886 1727204514.86240: done checking to see if all hosts have failed 34886 1727204514.86241: getting the remaining hosts for this loop 34886 1727204514.86243: done getting the remaining hosts for this loop 34886 1727204514.86247: getting the next task for host managed-node3 34886 1727204514.86254: done getting next task for host managed-node3 34886 1727204514.86257: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 34886 1727204514.86260: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204514.86264: getting variables 34886 1727204514.86265: in VariableManager get_vars() 34886 1727204514.86316: Calling all_inventory to load vars for managed-node3 34886 1727204514.86321: Calling groups_inventory to load vars for managed-node3 34886 1727204514.86323: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204514.86335: Calling all_plugins_play to load vars for managed-node3 34886 1727204514.86339: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204514.86343: Calling groups_plugins_play to load vars for managed-node3 34886 1727204514.92759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204514.94530: done with get_vars() 34886 1727204514.94555: done getting variables 34886 1727204514.94601: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:01:54 -0400 (0:00:00.098) 0:00:33.114 ***** 34886 1727204514.94624: entering _queue_task() for managed-node3/fail 34886 1727204514.94898: worker is 1 (out of 1 available) 34886 1727204514.94913: exiting _queue_task() for managed-node3/fail 34886 1727204514.94929: done queuing things up, now waiting for results queue to drain 34886 1727204514.94932: waiting for pending results... 
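(The guard tasks at manage_test_interface.yml:3 and :8 are plain fail actions gated by when conditions; the log records their false_condition strings, so a sketch on that basis, with the failure messages invented for illustration, would be:

- name: 'Ensure state in ["present", "absent"]'
  fail:
    msg: "state must be one of: present, absent"  # assumed message text
  when: state not in ["present", "absent"]

- name: 'Ensure type in ["dummy", "tap", "veth"]'
  fail:
    msg: "type must be one of: dummy, tap, veth"  # assumed message text
  when: type not in ["dummy", "tap", "veth"]

Both conditions evaluate to False in this run, as the skip results above and below show, so the include continues on to show_interfaces.yml.)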
34886 1727204514.95127: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 34886 1727204514.95218: in run() - task 12b410aa-8751-04b9-2e74-0000000005cd 34886 1727204514.95234: variable 'ansible_search_path' from source: unknown 34886 1727204514.95239: variable 'ansible_search_path' from source: unknown 34886 1727204514.95272: calling self._execute() 34886 1727204514.95374: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204514.95379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204514.95393: variable 'omit' from source: magic vars 34886 1727204514.95724: variable 'ansible_distribution_major_version' from source: facts 34886 1727204514.95740: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204514.95865: variable 'type' from source: play vars 34886 1727204514.95870: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 34886 1727204514.95876: when evaluation is False, skipping this task 34886 1727204514.95879: _execute() done 34886 1727204514.95881: dumping result to json 34886 1727204514.95887: done dumping result, returning 34886 1727204514.95894: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [12b410aa-8751-04b9-2e74-0000000005cd] 34886 1727204514.95901: sending task result for task 12b410aa-8751-04b9-2e74-0000000005cd 34886 1727204514.96001: done sending task result for task 12b410aa-8751-04b9-2e74-0000000005cd 34886 1727204514.96004: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 34886 1727204514.96056: no more pending results, returning what we have 34886 1727204514.96060: results queue empty 34886 1727204514.96061: checking for any_errors_fatal 34886 1727204514.96069: done checking for any_errors_fatal 34886 1727204514.96070: checking for max_fail_percentage 34886 1727204514.96072: done checking for max_fail_percentage 34886 1727204514.96073: checking to see if all hosts have failed and the running result is not ok 34886 1727204514.96074: done checking to see if all hosts have failed 34886 1727204514.96075: getting the remaining hosts for this loop 34886 1727204514.96076: done getting the remaining hosts for this loop 34886 1727204514.96080: getting the next task for host managed-node3 34886 1727204514.96086: done getting next task for host managed-node3 34886 1727204514.96091: ^ task is: TASK: Include the task 'show_interfaces.yml' 34886 1727204514.96094: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204514.96098: getting variables 34886 1727204514.96099: in VariableManager get_vars() 34886 1727204514.96142: Calling all_inventory to load vars for managed-node3 34886 1727204514.96145: Calling groups_inventory to load vars for managed-node3 34886 1727204514.96148: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204514.96159: Calling all_plugins_play to load vars for managed-node3 34886 1727204514.96162: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204514.96166: Calling groups_plugins_play to load vars for managed-node3 34886 1727204514.97372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204514.98950: done with get_vars() 34886 1727204514.98971: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:01:54 -0400 (0:00:00.044) 0:00:33.158 ***** 34886 1727204514.99050: entering _queue_task() for managed-node3/include_tasks 34886 1727204514.99290: worker is 1 (out of 1 available) 34886 1727204514.99306: exiting _queue_task() for managed-node3/include_tasks 34886 1727204514.99322: done queuing things up, now waiting for results queue to drain 34886 1727204514.99324: waiting for pending results... 34886 1727204514.99505: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 34886 1727204514.99607: in run() - task 12b410aa-8751-04b9-2e74-0000000005ce 34886 1727204514.99619: variable 'ansible_search_path' from source: unknown 34886 1727204514.99622: variable 'ansible_search_path' from source: unknown 34886 1727204514.99666: calling self._execute() 34886 1727204514.99750: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204514.99757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204514.99769: variable 'omit' from source: magic vars 34886 1727204515.00093: variable 'ansible_distribution_major_version' from source: facts 34886 1727204515.00109: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204515.00113: _execute() done 34886 1727204515.00116: dumping result to json 34886 1727204515.00119: done dumping result, returning 34886 1727204515.00128: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-04b9-2e74-0000000005ce] 34886 1727204515.00135: sending task result for task 12b410aa-8751-04b9-2e74-0000000005ce 34886 1727204515.00230: done sending task result for task 12b410aa-8751-04b9-2e74-0000000005ce 34886 1727204515.00233: WORKER PROCESS EXITING 34886 1727204515.00262: no more pending results, returning what we have 34886 1727204515.00268: in VariableManager get_vars() 34886 1727204515.00319: Calling all_inventory to load vars for managed-node3 34886 1727204515.00323: Calling groups_inventory to load vars for managed-node3 34886 1727204515.00326: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204515.00337: Calling all_plugins_play to load vars for managed-node3 34886 1727204515.00340: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204515.00344: Calling groups_plugins_play to load vars for managed-node3 34886 1727204515.01651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 34886 1727204515.03195: done with get_vars() 34886 1727204515.03214: variable 'ansible_search_path' from source: unknown 34886 1727204515.03214: variable 'ansible_search_path' from source: unknown 34886 1727204515.03246: we have included files to process 34886 1727204515.03247: generating all_blocks data 34886 1727204515.03248: done generating all_blocks data 34886 1727204515.03253: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34886 1727204515.03254: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34886 1727204515.03255: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34886 1727204515.03341: in VariableManager get_vars() 34886 1727204515.03361: done with get_vars() 34886 1727204515.03455: done processing included file 34886 1727204515.03457: iterating over new_blocks loaded from include file 34886 1727204515.03458: in VariableManager get_vars() 34886 1727204515.03473: done with get_vars() 34886 1727204515.03474: filtering new block on tags 34886 1727204515.03491: done filtering new block on tags 34886 1727204515.03493: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 34886 1727204515.03496: extending task lists for all hosts with included blocks 34886 1727204515.03826: done extending task lists 34886 1727204515.03827: done processing included files 34886 1727204515.03828: results queue empty 34886 1727204515.03828: checking for any_errors_fatal 34886 1727204515.03831: done checking for any_errors_fatal 34886 1727204515.03831: checking for max_fail_percentage 34886 1727204515.03832: done checking for max_fail_percentage 34886 1727204515.03833: checking to see if all hosts have failed and the running result is not ok 34886 1727204515.03834: done checking to see if all hosts have failed 34886 1727204515.03834: getting the remaining hosts for this loop 34886 1727204515.03835: done getting the remaining hosts for this loop 34886 1727204515.03837: getting the next task for host managed-node3 34886 1727204515.03840: done getting next task for host managed-node3 34886 1727204515.03842: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 34886 1727204515.03844: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204515.03846: getting variables 34886 1727204515.03847: in VariableManager get_vars() 34886 1727204515.03858: Calling all_inventory to load vars for managed-node3 34886 1727204515.03860: Calling groups_inventory to load vars for managed-node3 34886 1727204515.03861: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204515.03866: Calling all_plugins_play to load vars for managed-node3 34886 1727204515.03868: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204515.03870: Calling groups_plugins_play to load vars for managed-node3 34886 1727204515.04975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204515.06512: done with get_vars() 34886 1727204515.06533: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.075) 0:00:33.233 ***** 34886 1727204515.06593: entering _queue_task() for managed-node3/include_tasks 34886 1727204515.06846: worker is 1 (out of 1 available) 34886 1727204515.06861: exiting _queue_task() for managed-node3/include_tasks 34886 1727204515.06873: done queuing things up, now waiting for results queue to drain 34886 1727204515.06875: waiting for pending results... 34886 1727204515.07076: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 34886 1727204515.07162: in run() - task 12b410aa-8751-04b9-2e74-0000000006e4 34886 1727204515.07175: variable 'ansible_search_path' from source: unknown 34886 1727204515.07178: variable 'ansible_search_path' from source: unknown 34886 1727204515.07224: calling self._execute() 34886 1727204515.07307: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.07313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.07325: variable 'omit' from source: magic vars 34886 1727204515.07654: variable 'ansible_distribution_major_version' from source: facts 34886 1727204515.07669: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204515.07673: _execute() done 34886 1727204515.07676: dumping result to json 34886 1727204515.07679: done dumping result, returning 34886 1727204515.07687: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-04b9-2e74-0000000006e4] 34886 1727204515.07696: sending task result for task 12b410aa-8751-04b9-2e74-0000000006e4 34886 1727204515.07792: done sending task result for task 12b410aa-8751-04b9-2e74-0000000006e4 34886 1727204515.07795: WORKER PROCESS EXITING 34886 1727204515.07827: no more pending results, returning what we have 34886 1727204515.07832: in VariableManager get_vars() 34886 1727204515.07882: Calling all_inventory to load vars for managed-node3 34886 1727204515.07886: Calling groups_inventory to load vars for managed-node3 34886 1727204515.07888: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204515.07902: Calling all_plugins_play to load vars for managed-node3 34886 1727204515.07905: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204515.07909: Calling groups_plugins_play to load vars for managed-node3 34886 1727204515.09107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 34886 1727204515.10791: done with get_vars() 34886 1727204515.10812: variable 'ansible_search_path' from source: unknown 34886 1727204515.10813: variable 'ansible_search_path' from source: unknown 34886 1727204515.10865: we have included files to process 34886 1727204515.10866: generating all_blocks data 34886 1727204515.10868: done generating all_blocks data 34886 1727204515.10869: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34886 1727204515.10870: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34886 1727204515.10871: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34886 1727204515.11103: done processing included file 34886 1727204515.11104: iterating over new_blocks loaded from include file 34886 1727204515.11106: in VariableManager get_vars() 34886 1727204515.11125: done with get_vars() 34886 1727204515.11126: filtering new block on tags 34886 1727204515.11143: done filtering new block on tags 34886 1727204515.11145: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 34886 1727204515.11151: extending task lists for all hosts with included blocks 34886 1727204515.11276: done extending task lists 34886 1727204515.11277: done processing included files 34886 1727204515.11278: results queue empty 34886 1727204515.11278: checking for any_errors_fatal 34886 1727204515.11281: done checking for any_errors_fatal 34886 1727204515.11281: checking for max_fail_percentage 34886 1727204515.11282: done checking for max_fail_percentage 34886 1727204515.11283: checking to see if all hosts have failed and the running result is not ok 34886 1727204515.11284: done checking to see if all hosts have failed 34886 1727204515.11284: getting the remaining hosts for this loop 34886 1727204515.11285: done getting the remaining hosts for this loop 34886 1727204515.11287: getting the next task for host managed-node3 34886 1727204515.11292: done getting next task for host managed-node3 34886 1727204515.11294: ^ task is: TASK: Gather current interface info 34886 1727204515.11297: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 34886 1727204515.11298: getting variables 34886 1727204515.11299: in VariableManager get_vars() 34886 1727204515.11310: Calling all_inventory to load vars for managed-node3 34886 1727204515.11312: Calling groups_inventory to load vars for managed-node3 34886 1727204515.11314: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204515.11318: Calling all_plugins_play to load vars for managed-node3 34886 1727204515.11322: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204515.11325: Calling groups_plugins_play to load vars for managed-node3 34886 1727204515.12408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204515.13976: done with get_vars() 34886 1727204515.14001: done getting variables 34886 1727204515.14040: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.074) 0:00:33.308 ***** 34886 1727204515.14068: entering _queue_task() for managed-node3/command 34886 1727204515.14356: worker is 1 (out of 1 available) 34886 1727204515.14373: exiting _queue_task() for managed-node3/command 34886 1727204515.14387: done queuing things up, now waiting for results queue to drain 34886 1727204515.14391: waiting for pending results... 
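For orientation, here is a minimal sketch of what the 'Gather current interface info' task at get_current_interfaces.yml:3 likely looks like. The command and working directory are taken verbatim from the module_args logged further down (ls -1 with chdir /sys/class/net); the register name _current_interfaces is inferred from the later set_fact trace, so treat the YAML as a reconstruction rather than the collection's verbatim source.

    # Reconstructed sketch of tasks/get_current_interfaces.yml (not verbatim source)
    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces   # name inferred from the set_fact trace below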
34886 1727204515.14584: running TaskExecutor() for managed-node3/TASK: Gather current interface info 34886 1727204515.14683: in run() - task 12b410aa-8751-04b9-2e74-00000000071b 34886 1727204515.14698: variable 'ansible_search_path' from source: unknown 34886 1727204515.14703: variable 'ansible_search_path' from source: unknown 34886 1727204515.14739: calling self._execute() 34886 1727204515.14825: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.14832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.14844: variable 'omit' from source: magic vars 34886 1727204515.15173: variable 'ansible_distribution_major_version' from source: facts 34886 1727204515.15177: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204515.15186: variable 'omit' from source: magic vars 34886 1727204515.15235: variable 'omit' from source: magic vars 34886 1727204515.15265: variable 'omit' from source: magic vars 34886 1727204515.15306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204515.15338: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204515.15358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204515.15375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204515.15387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204515.15418: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204515.15425: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.15428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.15511: Set connection var ansible_timeout to 10 34886 1727204515.15518: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204515.15523: Set connection var ansible_connection to ssh 34886 1727204515.15529: Set connection var ansible_shell_executable to /bin/sh 34886 1727204515.15537: Set connection var ansible_pipelining to False 34886 1727204515.15540: Set connection var ansible_shell_type to sh 34886 1727204515.15562: variable 'ansible_shell_executable' from source: unknown 34886 1727204515.15566: variable 'ansible_connection' from source: unknown 34886 1727204515.15569: variable 'ansible_module_compression' from source: unknown 34886 1727204515.15571: variable 'ansible_shell_type' from source: unknown 34886 1727204515.15575: variable 'ansible_shell_executable' from source: unknown 34886 1727204515.15579: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.15584: variable 'ansible_pipelining' from source: unknown 34886 1727204515.15588: variable 'ansible_timeout' from source: unknown 34886 1727204515.15595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.15714: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204515.15730: variable 'omit' from source: magic vars 34886 
1727204515.15733: starting attempt loop 34886 1727204515.15736: running the handler 34886 1727204515.15749: _low_level_execute_command(): starting 34886 1727204515.15756: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204515.16323: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204515.16327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.16331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204515.16334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.16400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204515.16404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204515.16411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204515.16446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204515.18204: stdout chunk (state=3): >>>/root <<< 34886 1727204515.18316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204515.18372: stderr chunk (state=3): >>><<< 34886 1727204515.18376: stdout chunk (state=3): >>><<< 34886 1727204515.18399: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204515.18410: _low_level_execute_command(): starting 34886 1727204515.18417: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173 `" && echo 
ansible-tmp-1727204515.183981-36343-195814654963173="` echo /root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173 `" ) && sleep 0' 34886 1727204515.18850: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204515.18868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204515.18872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.18909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.18950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204515.18954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204515.18999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204515.21007: stdout chunk (state=3): >>>ansible-tmp-1727204515.183981-36343-195814654963173=/root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173 <<< 34886 1727204515.21127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204515.21176: stderr chunk (state=3): >>><<< 34886 1727204515.21180: stdout chunk (state=3): >>><<< 34886 1727204515.21196: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204515.183981-36343-195814654963173=/root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204515.21225: variable 'ansible_module_compression' from source: unknown 34886 1727204515.21263: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204515.21299: variable 'ansible_facts' from source: unknown 34886 1727204515.21363: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173/AnsiballZ_command.py 34886 1727204515.21473: Sending initial data 34886 1727204515.21476: Sent initial data (155 bytes) 34886 1727204515.21945: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204515.21948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204515.21951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.21954: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204515.21956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.22000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204515.22004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204515.22043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204515.23673: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34886 1727204515.23679: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204515.23695: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204515.23729: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmppuif3e75 /root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173/AnsiballZ_command.py <<< 34886 1727204515.23735: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173/AnsiballZ_command.py" <<< 34886 1727204515.23762: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmppuif3e75" to remote "/root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173/AnsiballZ_command.py" <<< 34886 1727204515.24532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204515.24594: stderr chunk (state=3): >>><<< 34886 1727204515.24597: stdout chunk (state=3): >>><<< 34886 1727204515.24615: done transferring module to remote 34886 1727204515.24628: _low_level_execute_command(): starting 34886 1727204515.24635: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173/ /root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173/AnsiballZ_command.py && sleep 0' 34886 1727204515.25087: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204515.25091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204515.25094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.25104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204515.25106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.25149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204515.25153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204515.25193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204515.27052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204515.27099: stderr chunk (state=3): >>><<< 34886 1727204515.27102: stdout chunk (state=3): >>><<< 34886 1727204515.27118: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204515.27125: _low_level_execute_command(): starting 34886 1727204515.27130: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173/AnsiballZ_command.py && sleep 0' 34886 1727204515.27622: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204515.27626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204515.27629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.27681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204515.27684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204515.27734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204515.45610: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:01:55.451208", "end": "2024-09-24 15:01:55.454817", "delta": "0:00:00.003609", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204515.47496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204515.47501: stdout chunk (state=3): >>><<< 34886 1727204515.47504: stderr chunk (state=3): >>><<< 34886 1727204515.47507: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:01:55.451208", "end": "2024-09-24 15:01:55.454817", "delta": "0:00:00.003609", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
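The 'Set connection var' lines above show the effective connection settings for this task: ansible_connection=ssh, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_pipelining=False, ansible_timeout=10, with ZIP_DEFLATED module compression. Per the log, only ansible_host and ansible_ssh_extra_args come from host vars; the rest are 'from source: unknown', i.e. Ansible defaults. Purely for reference, a hypothetical host_vars snippet that pinned the same values explicitly might look like the sketch below (everything except ansible_host is restated from defaults and is an assumption, not the test harness's real inventory):

    # Hypothetical host_vars/managed-node3.yml pinning the logged connection settings
    ansible_host: 10.31.10.90
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_pipelining: false
    ansible_timeout: 10
    ansible_module_compression: ZIP_DEFLATED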
34886 1727204515.47510: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204515.47513: _low_level_execute_command(): starting 34886 1727204515.47515: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204515.183981-36343-195814654963173/ > /dev/null 2>&1 && sleep 0' 34886 1727204515.48177: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204515.48187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204515.48202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204515.48221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204515.48232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204515.48265: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204515.48273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204515.48304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.48383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204515.48411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204515.48485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204515.50503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204515.50507: stdout chunk (state=3): >>><<< 34886 1727204515.50515: stderr chunk (state=3): >>><<< 34886 1727204515.50537: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204515.50543: handler run complete 34886 1727204515.50574: Evaluated conditional (False): False 34886 1727204515.50588: attempt loop complete, returning result 34886 1727204515.50594: _execute() done 34886 1727204515.50596: dumping result to json 34886 1727204515.50604: done dumping result, returning 34886 1727204515.50615: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [12b410aa-8751-04b9-2e74-00000000071b] 34886 1727204515.50624: sending task result for task 12b410aa-8751-04b9-2e74-00000000071b 34886 1727204515.50742: done sending task result for task 12b410aa-8751-04b9-2e74-00000000071b 34886 1727204515.50893: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003609", "end": "2024-09-24 15:01:55.454817", "rc": 0, "start": "2024-09-24 15:01:55.451208" } STDOUT: bonding_masters eth0 lo veth0 34886 1727204515.50978: no more pending results, returning what we have 34886 1727204515.50982: results queue empty 34886 1727204515.50983: checking for any_errors_fatal 34886 1727204515.50985: done checking for any_errors_fatal 34886 1727204515.50986: checking for max_fail_percentage 34886 1727204515.50987: done checking for max_fail_percentage 34886 1727204515.50988: checking to see if all hosts have failed and the running result is not ok 34886 1727204515.50991: done checking to see if all hosts have failed 34886 1727204515.50992: getting the remaining hosts for this loop 34886 1727204515.50994: done getting the remaining hosts for this loop 34886 1727204515.50998: getting the next task for host managed-node3 34886 1727204515.51005: done getting next task for host managed-node3 34886 1727204515.51008: ^ task is: TASK: Set current_interfaces 34886 1727204515.51014: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204515.51018: getting variables 34886 1727204515.51022: in VariableManager get_vars() 34886 1727204515.51062: Calling all_inventory to load vars for managed-node3 34886 1727204515.51070: Calling groups_inventory to load vars for managed-node3 34886 1727204515.51073: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204515.51085: Calling all_plugins_play to load vars for managed-node3 34886 1727204515.51090: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204515.51096: Calling groups_plugins_play to load vars for managed-node3 34886 1727204515.53523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204515.56702: done with get_vars() 34886 1727204515.56752: done getting variables 34886 1727204515.56841: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.428) 0:00:33.736 ***** 34886 1727204515.56884: entering _queue_task() for managed-node3/set_fact 34886 1727204515.57330: worker is 1 (out of 1 available) 34886 1727204515.57344: exiting _queue_task() for managed-node3/set_fact 34886 1727204515.57358: done queuing things up, now waiting for results queue to drain 34886 1727204515.57359: waiting for pending results... 
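The set_fact trace that follows reads the registered _current_interfaces result and publishes it as the current_interfaces fact (['bonding_masters', 'eth0', 'lo', 'veth0'] in this run). A minimal sketch of such a task, assuming the usual stdout_lines conversion; the real file may use a different expression:

    # Sketch of the Set current_interfaces task (expression assumed, not verbatim)
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"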
34886 1727204515.57673: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 34886 1727204515.58100: in run() - task 12b410aa-8751-04b9-2e74-00000000071c 34886 1727204515.58105: variable 'ansible_search_path' from source: unknown 34886 1727204515.58109: variable 'ansible_search_path' from source: unknown 34886 1727204515.58112: calling self._execute() 34886 1727204515.58236: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.58254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.58275: variable 'omit' from source: magic vars 34886 1727204515.58794: variable 'ansible_distribution_major_version' from source: facts 34886 1727204515.58814: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204515.58830: variable 'omit' from source: magic vars 34886 1727204515.58917: variable 'omit' from source: magic vars 34886 1727204515.59082: variable '_current_interfaces' from source: set_fact 34886 1727204515.59185: variable 'omit' from source: magic vars 34886 1727204515.59294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204515.59300: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204515.59330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204515.59360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204515.59381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204515.59437: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204515.59447: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.59457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.59623: Set connection var ansible_timeout to 10 34886 1727204515.59637: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204515.59696: Set connection var ansible_connection to ssh 34886 1727204515.59700: Set connection var ansible_shell_executable to /bin/sh 34886 1727204515.59702: Set connection var ansible_pipelining to False 34886 1727204515.59704: Set connection var ansible_shell_type to sh 34886 1727204515.59712: variable 'ansible_shell_executable' from source: unknown 34886 1727204515.59733: variable 'ansible_connection' from source: unknown 34886 1727204515.59835: variable 'ansible_module_compression' from source: unknown 34886 1727204515.59838: variable 'ansible_shell_type' from source: unknown 34886 1727204515.59841: variable 'ansible_shell_executable' from source: unknown 34886 1727204515.59845: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.59847: variable 'ansible_pipelining' from source: unknown 34886 1727204515.59850: variable 'ansible_timeout' from source: unknown 34886 1727204515.59853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.59998: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 34886 1727204515.60017: variable 'omit' from source: magic vars 34886 1727204515.60082: starting attempt loop 34886 1727204515.60085: running the handler 34886 1727204515.60088: handler run complete 34886 1727204515.60093: attempt loop complete, returning result 34886 1727204515.60095: _execute() done 34886 1727204515.60097: dumping result to json 34886 1727204515.60100: done dumping result, returning 34886 1727204515.60111: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [12b410aa-8751-04b9-2e74-00000000071c] 34886 1727204515.60129: sending task result for task 12b410aa-8751-04b9-2e74-00000000071c 34886 1727204515.60369: done sending task result for task 12b410aa-8751-04b9-2e74-00000000071c 34886 1727204515.60373: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "veth0" ] }, "changed": false } 34886 1727204515.60449: no more pending results, returning what we have 34886 1727204515.60453: results queue empty 34886 1727204515.60455: checking for any_errors_fatal 34886 1727204515.60465: done checking for any_errors_fatal 34886 1727204515.60466: checking for max_fail_percentage 34886 1727204515.60468: done checking for max_fail_percentage 34886 1727204515.60470: checking to see if all hosts have failed and the running result is not ok 34886 1727204515.60471: done checking to see if all hosts have failed 34886 1727204515.60472: getting the remaining hosts for this loop 34886 1727204515.60474: done getting the remaining hosts for this loop 34886 1727204515.60480: getting the next task for host managed-node3 34886 1727204515.60493: done getting next task for host managed-node3 34886 1727204515.60496: ^ task is: TASK: Show current_interfaces 34886 1727204515.60502: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204515.60508: getting variables 34886 1727204515.60510: in VariableManager get_vars() 34886 1727204515.60567: Calling all_inventory to load vars for managed-node3 34886 1727204515.60571: Calling groups_inventory to load vars for managed-node3 34886 1727204515.60574: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204515.60792: Calling all_plugins_play to load vars for managed-node3 34886 1727204515.60799: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204515.60804: Calling groups_plugins_play to load vars for managed-node3 34886 1727204515.63369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204515.66843: done with get_vars() 34886 1727204515.66885: done getting variables 34886 1727204515.66956: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.101) 0:00:33.837 ***** 34886 1727204515.67003: entering _queue_task() for managed-node3/debug 34886 1727204515.67407: worker is 1 (out of 1 available) 34886 1727204515.67426: exiting _queue_task() for managed-node3/debug 34886 1727204515.67440: done queuing things up, now waiting for results queue to drain 34886 1727204515.67442: waiting for pending results... 34886 1727204515.67869: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 34886 1727204515.67964: in run() - task 12b410aa-8751-04b9-2e74-0000000006e5 34886 1727204515.67969: variable 'ansible_search_path' from source: unknown 34886 1727204515.67972: variable 'ansible_search_path' from source: unknown 34886 1727204515.68015: calling self._execute() 34886 1727204515.68181: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.68186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.68190: variable 'omit' from source: magic vars 34886 1727204515.68649: variable 'ansible_distribution_major_version' from source: facts 34886 1727204515.68668: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204515.68680: variable 'omit' from source: magic vars 34886 1727204515.68753: variable 'omit' from source: magic vars 34886 1727204515.68948: variable 'current_interfaces' from source: set_fact 34886 1727204515.68952: variable 'omit' from source: magic vars 34886 1727204515.68988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204515.69040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204515.69076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204515.69105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204515.69126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204515.69177: 
variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204515.69186: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.69197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.69379: Set connection var ansible_timeout to 10 34886 1727204515.69383: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204515.69385: Set connection var ansible_connection to ssh 34886 1727204515.69388: Set connection var ansible_shell_executable to /bin/sh 34886 1727204515.69392: Set connection var ansible_pipelining to False 34886 1727204515.69394: Set connection var ansible_shell_type to sh 34886 1727204515.69594: variable 'ansible_shell_executable' from source: unknown 34886 1727204515.69598: variable 'ansible_connection' from source: unknown 34886 1727204515.69601: variable 'ansible_module_compression' from source: unknown 34886 1727204515.69603: variable 'ansible_shell_type' from source: unknown 34886 1727204515.69605: variable 'ansible_shell_executable' from source: unknown 34886 1727204515.69607: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.69609: variable 'ansible_pipelining' from source: unknown 34886 1727204515.69611: variable 'ansible_timeout' from source: unknown 34886 1727204515.69614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.69650: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204515.69669: variable 'omit' from source: magic vars 34886 1727204515.69679: starting attempt loop 34886 1727204515.69687: running the handler 34886 1727204515.69754: handler run complete 34886 1727204515.69778: attempt loop complete, returning result 34886 1727204515.69785: _execute() done 34886 1727204515.69795: dumping result to json 34886 1727204515.69802: done dumping result, returning 34886 1727204515.69814: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [12b410aa-8751-04b9-2e74-0000000006e5] 34886 1727204515.69827: sending task result for task 12b410aa-8751-04b9-2e74-0000000006e5 ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'veth0'] 34886 1727204515.70111: no more pending results, returning what we have 34886 1727204515.70116: results queue empty 34886 1727204515.70117: checking for any_errors_fatal 34886 1727204515.70128: done checking for any_errors_fatal 34886 1727204515.70129: checking for max_fail_percentage 34886 1727204515.70131: done checking for max_fail_percentage 34886 1727204515.70133: checking to see if all hosts have failed and the running result is not ok 34886 1727204515.70134: done checking to see if all hosts have failed 34886 1727204515.70135: getting the remaining hosts for this loop 34886 1727204515.70136: done getting the remaining hosts for this loop 34886 1727204515.70141: getting the next task for host managed-node3 34886 1727204515.70151: done getting next task for host managed-node3 34886 1727204515.70154: ^ task is: TASK: Install iproute 34886 1727204515.70158: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34886 1727204515.70164: getting variables 34886 1727204515.70166: in VariableManager get_vars() 34886 1727204515.70215: Calling all_inventory to load vars for managed-node3 34886 1727204515.70221: Calling groups_inventory to load vars for managed-node3 34886 1727204515.70224: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204515.70238: Calling all_plugins_play to load vars for managed-node3 34886 1727204515.70242: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204515.70246: Calling groups_plugins_play to load vars for managed-node3 34886 1727204515.70906: done sending task result for task 12b410aa-8751-04b9-2e74-0000000006e5 34886 1727204515.70910: WORKER PROCESS EXITING 34886 1727204515.72728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204515.75842: done with get_vars() 34886 1727204515.75890: done getting variables 34886 1727204515.75967: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:01:55 -0400 (0:00:00.089) 0:00:33.927 ***** 34886 1727204515.76005: entering _queue_task() for managed-node3/package 34886 1727204515.76610: worker is 1 (out of 1 available) 34886 1727204515.76625: exiting _queue_task() for managed-node3/package 34886 1727204515.76637: done queuing things up, now waiting for results queue to drain 34886 1727204515.76639: waiting for pending results... 
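The next trace runs the 'Install iproute' task through the package action plugin and, while building its arguments, consults the __network_is_ostree fact (itself set earlier via set_fact) to account for ostree-based systems. A minimal sketch of an equivalent task is shown below; the ostree-specific handling in the real test file is not reproduced here and the exact options are assumptions:

    # Minimal sketch of the Install iproute task (ostree-specific handling omitted)
    - name: Install iproute
      package:
        name: iproute
        state: present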
34886 1727204515.76768: running TaskExecutor() for managed-node3/TASK: Install iproute 34886 1727204515.77002: in run() - task 12b410aa-8751-04b9-2e74-0000000005cf 34886 1727204515.77006: variable 'ansible_search_path' from source: unknown 34886 1727204515.77009: variable 'ansible_search_path' from source: unknown 34886 1727204515.77011: calling self._execute() 34886 1727204515.77126: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.77141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.77159: variable 'omit' from source: magic vars 34886 1727204515.77654: variable 'ansible_distribution_major_version' from source: facts 34886 1727204515.77677: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204515.77693: variable 'omit' from source: magic vars 34886 1727204515.77746: variable 'omit' from source: magic vars 34886 1727204515.78034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34886 1727204515.80775: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34886 1727204515.80876: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34886 1727204515.80941: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34886 1727204515.81473: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34886 1727204515.81478: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34886 1727204515.81584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34886 1727204515.81635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34886 1727204515.81690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34886 1727204515.81742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34886 1727204515.81797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34886 1727204515.81915: variable '__network_is_ostree' from source: set_fact 34886 1727204515.81933: variable 'omit' from source: magic vars 34886 1727204515.81974: variable 'omit' from source: magic vars 34886 1727204515.82023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204515.82094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204515.82098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204515.82117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 34886 1727204515.82146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204515.82191: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204515.82236: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.82240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.82367: Set connection var ansible_timeout to 10 34886 1727204515.82380: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204515.82388: Set connection var ansible_connection to ssh 34886 1727204515.82403: Set connection var ansible_shell_executable to /bin/sh 34886 1727204515.82418: Set connection var ansible_pipelining to False 34886 1727204515.82449: Set connection var ansible_shell_type to sh 34886 1727204515.82473: variable 'ansible_shell_executable' from source: unknown 34886 1727204515.82483: variable 'ansible_connection' from source: unknown 34886 1727204515.82558: variable 'ansible_module_compression' from source: unknown 34886 1727204515.82562: variable 'ansible_shell_type' from source: unknown 34886 1727204515.82564: variable 'ansible_shell_executable' from source: unknown 34886 1727204515.82566: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204515.82570: variable 'ansible_pipelining' from source: unknown 34886 1727204515.82573: variable 'ansible_timeout' from source: unknown 34886 1727204515.82575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204515.82678: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204515.82702: variable 'omit' from source: magic vars 34886 1727204515.82714: starting attempt loop 34886 1727204515.82777: running the handler 34886 1727204515.82781: variable 'ansible_facts' from source: unknown 34886 1727204515.82784: variable 'ansible_facts' from source: unknown 34886 1727204515.82801: _low_level_execute_command(): starting 34886 1727204515.82815: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204515.83610: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 34886 1727204515.83701: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.83713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 34886 1727204515.83763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204515.83767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204515.83826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204515.85597: stdout chunk (state=3): >>>/root <<< 34886 1727204515.85803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204515.85807: stdout chunk (state=3): >>><<< 34886 1727204515.85809: stderr chunk (state=3): >>><<< 34886 1727204515.85944: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204515.85957: _low_level_execute_command(): starting 34886 1727204515.85960: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589 `" && echo ansible-tmp-1727204515.8583915-36365-73166679642589="` echo /root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589 `" ) && sleep 0' 34886 1727204515.86551: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204515.86608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.86684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204515.86708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204515.86755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
34886 1727204515.86822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204515.88843: stdout chunk (state=3): >>>ansible-tmp-1727204515.8583915-36365-73166679642589=/root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589 <<< 34886 1727204515.89066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204515.89070: stdout chunk (state=3): >>><<< 34886 1727204515.89073: stderr chunk (state=3): >>><<< 34886 1727204515.89094: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204515.8583915-36365-73166679642589=/root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204515.89142: variable 'ansible_module_compression' from source: unknown 34886 1727204515.89395: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 34886 1727204515.89399: variable 'ansible_facts' from source: unknown 34886 1727204515.89410: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589/AnsiballZ_dnf.py 34886 1727204515.89653: Sending initial data 34886 1727204515.89656: Sent initial data (151 bytes) 34886 1727204515.90270: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204515.90405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204515.90410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.90435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
34886 1727204515.90450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204515.90472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204515.90539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204515.92185: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204515.92237: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204515.92301: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp1c53lj6l /root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589/AnsiballZ_dnf.py <<< 34886 1727204515.92328: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589/AnsiballZ_dnf.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp1c53lj6l" to remote "/root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589/AnsiballZ_dnf.py" <<< 34886 1727204515.93809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204515.93840: stderr chunk (state=3): >>><<< 34886 1727204515.93852: stdout chunk (state=3): >>><<< 34886 1727204515.93891: done transferring module to remote 34886 1727204515.94015: _low_level_execute_command(): starting 34886 1727204515.94019: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589/ /root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589/AnsiballZ_dnf.py && sleep 0' 34886 1727204515.94658: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204515.94677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204515.94698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.94742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204515.94757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204515.94810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204515.96899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204515.96903: stdout chunk (state=3): >>><<< 34886 1727204515.96905: stderr chunk (state=3): >>><<< 34886 1727204515.96908: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204515.96911: _low_level_execute_command(): starting 34886 1727204515.96914: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589/AnsiballZ_dnf.py && sleep 0' 34886 1727204515.97449: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204515.97456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204515.97463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204515.97505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204515.97556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204515.97576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204515.97606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 34886 1727204517.45114: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 34886 1727204517.50122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 34886 1727204517.50244: stderr chunk (state=3): >>><<< 34886 1727204517.50248: stdout chunk (state=3): >>><<< 34886 1727204517.50396: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
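Although the play uses the generic package action, the log shows it resolved to ansible.legacy.dnf on this host, and the recorded module_args reduce to the sketch below (only name and state were set; every other recorded argument is the dnf module default).

- name: Install iproute (equivalent direct dnf invocation, per the recorded module_args)
  ansible.builtin.dnf:
    name: iproute
    state: present
    # all other arguments in the recorded invocation are module defaults
    # (use_backend: auto, lock_timeout: 30, install_weak_deps: true, ...)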
34886 1727204517.50400: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204517.50404: _low_level_execute_command(): starting 34886 1727204517.50406: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204515.8583915-36365-73166679642589/ > /dev/null 2>&1 && sleep 0' 34886 1727204517.51096: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204517.51113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204517.51139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204517.51198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204517.51277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204517.51309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204517.51357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204517.51413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204517.53450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204517.53696: stdout chunk (state=3): >>><<< 34886 1727204517.53699: stderr chunk (state=3): >>><<< 34886 1727204517.53703: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204517.53705: handler run complete 34886 1727204517.53782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34886 1727204517.54080: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34886 1727204517.54150: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34886 1727204517.54207: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34886 1727204517.54252: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34886 1727204517.54369: variable '__install_status' from source: set_fact 34886 1727204517.54411: Evaluated conditional (__install_status is success): True 34886 1727204517.54454: attempt loop complete, returning result 34886 1727204517.54463: _execute() done 34886 1727204517.54471: dumping result to json 34886 1727204517.54482: done dumping result, returning 34886 1727204517.54508: done running TaskExecutor() for managed-node3/TASK: Install iproute [12b410aa-8751-04b9-2e74-0000000005cf] 34886 1727204517.54523: sending task result for task 12b410aa-8751-04b9-2e74-0000000005cf ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 34886 1727204517.54837: no more pending results, returning what we have 34886 1727204517.54841: results queue empty 34886 1727204517.54843: checking for any_errors_fatal 34886 1727204517.54850: done checking for any_errors_fatal 34886 1727204517.54852: checking for max_fail_percentage 34886 1727204517.54854: done checking for max_fail_percentage 34886 1727204517.54855: checking to see if all hosts have failed and the running result is not ok 34886 1727204517.54856: done checking to see if all hosts have failed 34886 1727204517.54857: getting the remaining hosts for this loop 34886 1727204517.54859: done getting the remaining hosts for this loop 34886 1727204517.54865: getting the next task for host managed-node3 34886 1727204517.54872: done getting next task for host managed-node3 34886 1727204517.54875: ^ task is: TASK: Create veth interface {{ interface }} 34886 1727204517.54879: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204517.54884: getting variables 34886 1727204517.54886: in VariableManager get_vars() 34886 1727204517.55086: Calling all_inventory to load vars for managed-node3 34886 1727204517.55152: Calling groups_inventory to load vars for managed-node3 34886 1727204517.55157: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204517.55197: done sending task result for task 12b410aa-8751-04b9-2e74-0000000005cf 34886 1727204517.55201: WORKER PROCESS EXITING 34886 1727204517.55214: Calling all_plugins_play to load vars for managed-node3 34886 1727204517.55226: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204517.55232: Calling groups_plugins_play to load vars for managed-node3 34886 1727204517.58235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204517.61370: done with get_vars() 34886 1727204517.61408: done getting variables 34886 1727204517.61484: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204517.61627: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:01:57 -0400 (0:00:01.856) 0:00:35.784 ***** 34886 1727204517.61665: entering _queue_task() for managed-node3/command 34886 1727204517.62060: worker is 1 (out of 1 available) 34886 1727204517.62075: exiting _queue_task() for managed-node3/command 34886 1727204517.62294: done queuing things up, now waiting for results queue to drain 34886 1727204517.62297: waiting for pending results... 
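The skip output that follows reports the rendered loop items and the guard condition for the next task, so it is roughly equivalent to the looped command task sketched below (a reconstruction from the log with the interface variable templated back in; not the verbatim tasks file).

- name: Create veth interface {{ interface }}
  ansible.builtin.command: "{{ item }}"
  with_items:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
    - ip link set {{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces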
34886 1727204517.62445: running TaskExecutor() for managed-node3/TASK: Create veth interface veth0 34886 1727204517.62591: in run() - task 12b410aa-8751-04b9-2e74-0000000005d0 34886 1727204517.62616: variable 'ansible_search_path' from source: unknown 34886 1727204517.62695: variable 'ansible_search_path' from source: unknown 34886 1727204517.62993: variable 'interface' from source: play vars 34886 1727204517.63116: variable 'interface' from source: play vars 34886 1727204517.63227: variable 'interface' from source: play vars 34886 1727204517.63439: Loaded config def from plugin (lookup/items) 34886 1727204517.63454: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 34886 1727204517.63492: variable 'omit' from source: magic vars 34886 1727204517.63675: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204517.63702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204517.63729: variable 'omit' from source: magic vars 34886 1727204517.64097: variable 'ansible_distribution_major_version' from source: facts 34886 1727204517.64100: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204517.64374: variable 'type' from source: play vars 34886 1727204517.64393: variable 'state' from source: include params 34886 1727204517.64405: variable 'interface' from source: play vars 34886 1727204517.64416: variable 'current_interfaces' from source: set_fact 34886 1727204517.64451: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 34886 1727204517.64455: when evaluation is False, skipping this task 34886 1727204517.64493: variable 'item' from source: unknown 34886 1727204517.64597: variable 'item' from source: unknown skipping: [managed-node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 34886 1727204517.65200: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204517.65204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204517.65207: variable 'omit' from source: magic vars 34886 1727204517.65209: variable 'ansible_distribution_major_version' from source: facts 34886 1727204517.65211: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204517.65375: variable 'type' from source: play vars 34886 1727204517.65386: variable 'state' from source: include params 34886 1727204517.65398: variable 'interface' from source: play vars 34886 1727204517.65406: variable 'current_interfaces' from source: set_fact 34886 1727204517.65427: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 34886 1727204517.65434: when evaluation is False, skipping this task 34886 1727204517.65475: variable 'item' from source: unknown 34886 1727204517.65561: variable 'item' from source: unknown skipping: [managed-node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 34886 1727204517.65747: variable 
'ansible_host' from source: host vars for 'managed-node3' 34886 1727204517.65758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204517.65778: variable 'omit' from source: magic vars 34886 1727204517.65997: variable 'ansible_distribution_major_version' from source: facts 34886 1727204517.66010: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204517.66276: variable 'type' from source: play vars 34886 1727204517.66297: variable 'state' from source: include params 34886 1727204517.66309: variable 'interface' from source: play vars 34886 1727204517.66322: variable 'current_interfaces' from source: set_fact 34886 1727204517.66335: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 34886 1727204517.66343: when evaluation is False, skipping this task 34886 1727204517.66383: variable 'item' from source: unknown 34886 1727204517.66494: variable 'item' from source: unknown skipping: [managed-node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 34886 1727204517.66798: dumping result to json 34886 1727204517.66801: done dumping result, returning 34886 1727204517.66804: done running TaskExecutor() for managed-node3/TASK: Create veth interface veth0 [12b410aa-8751-04b9-2e74-0000000005d0] 34886 1727204517.66807: sending task result for task 12b410aa-8751-04b9-2e74-0000000005d0 34886 1727204517.66854: done sending task result for task 12b410aa-8751-04b9-2e74-0000000005d0 34886 1727204517.66857: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false } MSG: All items skipped 34886 1727204517.66912: no more pending results, returning what we have 34886 1727204517.66917: results queue empty 34886 1727204517.66921: checking for any_errors_fatal 34886 1727204517.66936: done checking for any_errors_fatal 34886 1727204517.66937: checking for max_fail_percentage 34886 1727204517.66939: done checking for max_fail_percentage 34886 1727204517.66940: checking to see if all hosts have failed and the running result is not ok 34886 1727204517.66941: done checking to see if all hosts have failed 34886 1727204517.66942: getting the remaining hosts for this loop 34886 1727204517.66943: done getting the remaining hosts for this loop 34886 1727204517.66948: getting the next task for host managed-node3 34886 1727204517.66955: done getting next task for host managed-node3 34886 1727204517.66959: ^ task is: TASK: Set up veth as managed by NetworkManager 34886 1727204517.66962: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204517.66969: getting variables 34886 1727204517.66971: in VariableManager get_vars() 34886 1727204517.67025: Calling all_inventory to load vars for managed-node3 34886 1727204517.67029: Calling groups_inventory to load vars for managed-node3 34886 1727204517.67032: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204517.67046: Calling all_plugins_play to load vars for managed-node3 34886 1727204517.67051: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204517.67055: Calling groups_plugins_play to load vars for managed-node3 34886 1727204517.70366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204517.74623: done with get_vars() 34886 1727204517.74729: done getting variables 34886 1727204517.75065: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:01:57 -0400 (0:00:00.134) 0:00:35.918 ***** 34886 1727204517.75111: entering _queue_task() for managed-node3/command 34886 1727204517.75796: worker is 1 (out of 1 available) 34886 1727204517.75810: exiting _queue_task() for managed-node3/command 34886 1727204517.75826: done queuing things up, now waiting for results queue to drain 34886 1727204517.75828: waiting for pending results... 
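Only the guard of the next task is visible in this excerpt, because the task is skipped before its arguments are templated; structurally it is a conditional command task like the sketch below, where the command line is a hypothetical stand-in.

- name: Set up veth as managed by NetworkManager
  ansible.builtin.command: nmcli d set {{ interface }} managed true  # hypothetical stand-in; the real command is not shown in this excerpt
  when: type == 'veth' and state == 'present'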
34886 1727204517.76314: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 34886 1727204517.76607: in run() - task 12b410aa-8751-04b9-2e74-0000000005d1 34886 1727204517.76629: variable 'ansible_search_path' from source: unknown 34886 1727204517.76632: variable 'ansible_search_path' from source: unknown 34886 1727204517.76738: calling self._execute() 34886 1727204517.76776: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204517.76784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204517.76964: variable 'omit' from source: magic vars 34886 1727204517.77427: variable 'ansible_distribution_major_version' from source: facts 34886 1727204517.77442: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204517.77646: variable 'type' from source: play vars 34886 1727204517.77654: variable 'state' from source: include params 34886 1727204517.77661: Evaluated conditional (type == 'veth' and state == 'present'): False 34886 1727204517.77665: when evaluation is False, skipping this task 34886 1727204517.77669: _execute() done 34886 1727204517.77671: dumping result to json 34886 1727204517.77716: done dumping result, returning 34886 1727204517.77724: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [12b410aa-8751-04b9-2e74-0000000005d1] 34886 1727204517.77728: sending task result for task 12b410aa-8751-04b9-2e74-0000000005d1 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 34886 1727204517.77958: no more pending results, returning what we have 34886 1727204517.77961: results queue empty 34886 1727204517.77962: checking for any_errors_fatal 34886 1727204517.77972: done checking for any_errors_fatal 34886 1727204517.77973: checking for max_fail_percentage 34886 1727204517.77975: done checking for max_fail_percentage 34886 1727204517.77976: checking to see if all hosts have failed and the running result is not ok 34886 1727204517.77977: done checking to see if all hosts have failed 34886 1727204517.77978: getting the remaining hosts for this loop 34886 1727204517.77979: done getting the remaining hosts for this loop 34886 1727204517.77983: getting the next task for host managed-node3 34886 1727204517.77991: done getting next task for host managed-node3 34886 1727204517.77994: ^ task is: TASK: Delete veth interface {{ interface }} 34886 1727204517.77997: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204517.78001: getting variables 34886 1727204517.78002: in VariableManager get_vars() 34886 1727204517.78045: Calling all_inventory to load vars for managed-node3 34886 1727204517.78048: Calling groups_inventory to load vars for managed-node3 34886 1727204517.78051: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204517.78063: Calling all_plugins_play to load vars for managed-node3 34886 1727204517.78067: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204517.78071: Calling groups_plugins_play to load vars for managed-node3 34886 1727204517.78610: done sending task result for task 12b410aa-8751-04b9-2e74-0000000005d1 34886 1727204517.78614: WORKER PROCESS EXITING 34886 1727204517.80464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204517.83718: done with get_vars() 34886 1727204517.83776: done getting variables 34886 1727204517.83855: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204517.84012: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:01:57 -0400 (0:00:00.089) 0:00:36.008 ***** 34886 1727204517.84053: entering _queue_task() for managed-node3/command 34886 1727204517.84485: worker is 1 (out of 1 available) 34886 1727204517.84504: exiting _queue_task() for managed-node3/command 34886 1727204517.84526: done queuing things up, now waiting for results queue to drain 34886 1727204517.84529: waiting for pending results... 
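The "Delete veth interface veth0" task that starts below does run (its conditional evaluates True). Its guard and its use of the command action are visible in the log; the command line in this sketch is an assumption, since it has not been echoed at this point in the excerpt.

- name: Delete veth interface {{ interface }}
  ansible.builtin.command: ip link del {{ interface }}  # assumption: the exact command appears later in the run, not in this excerpt
  when: type == 'veth' and state == 'absent' and interface in current_interfaces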
34886 1727204517.84798: running TaskExecutor() for managed-node3/TASK: Delete veth interface veth0 34886 1727204517.84966: in run() - task 12b410aa-8751-04b9-2e74-0000000005d2 34886 1727204517.84980: variable 'ansible_search_path' from source: unknown 34886 1727204517.84983: variable 'ansible_search_path' from source: unknown 34886 1727204517.84987: calling self._execute() 34886 1727204517.85073: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204517.85082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204517.85095: variable 'omit' from source: magic vars 34886 1727204517.85512: variable 'ansible_distribution_major_version' from source: facts 34886 1727204517.85594: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204517.85785: variable 'type' from source: play vars 34886 1727204517.85790: variable 'state' from source: include params 34886 1727204517.85798: variable 'interface' from source: play vars 34886 1727204517.85802: variable 'current_interfaces' from source: set_fact 34886 1727204517.85813: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 34886 1727204517.85823: variable 'omit' from source: magic vars 34886 1727204517.85867: variable 'omit' from source: magic vars 34886 1727204517.85984: variable 'interface' from source: play vars 34886 1727204517.86006: variable 'omit' from source: magic vars 34886 1727204517.86052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204517.86094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204517.86117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204517.86162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204517.86167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204517.86185: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204517.86268: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204517.86272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204517.86332: Set connection var ansible_timeout to 10 34886 1727204517.86339: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204517.86343: Set connection var ansible_connection to ssh 34886 1727204517.86351: Set connection var ansible_shell_executable to /bin/sh 34886 1727204517.86362: Set connection var ansible_pipelining to False 34886 1727204517.86365: Set connection var ansible_shell_type to sh 34886 1727204517.86397: variable 'ansible_shell_executable' from source: unknown 34886 1727204517.86401: variable 'ansible_connection' from source: unknown 34886 1727204517.86405: variable 'ansible_module_compression' from source: unknown 34886 1727204517.86408: variable 'ansible_shell_type' from source: unknown 34886 1727204517.86413: variable 'ansible_shell_executable' from source: unknown 34886 1727204517.86417: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204517.86423: variable 'ansible_pipelining' from source: unknown 34886 1727204517.86426: variable 'ansible_timeout' from source: unknown 34886 1727204517.86526: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204517.86591: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204517.86610: variable 'omit' from source: magic vars 34886 1727204517.86616: starting attempt loop 34886 1727204517.86622: running the handler 34886 1727204517.86648: _low_level_execute_command(): starting 34886 1727204517.86651: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204517.87651: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204517.87671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204517.87675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204517.87812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204517.87856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204517.89640: stdout chunk (state=3): >>>/root <<< 34886 1727204517.89843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204517.89847: stdout chunk (state=3): >>><<< 34886 1727204517.89849: stderr chunk (state=3): >>><<< 34886 1727204517.89870: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 34886 1727204517.89892: _low_level_execute_command(): starting 34886 1727204517.89905: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728 `" && echo ansible-tmp-1727204517.898773-36407-1719545657728="` echo /root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728 `" ) && sleep 0' 34886 1727204517.90578: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204517.90596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204517.90612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204517.90648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204517.90709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204517.90779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204517.90800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204517.90824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204517.90896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204517.92930: stdout chunk (state=3): >>>ansible-tmp-1727204517.898773-36407-1719545657728=/root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728 <<< 34886 1727204517.93138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204517.93142: stdout chunk (state=3): >>><<< 34886 1727204517.93145: stderr chunk (state=3): >>><<< 34886 1727204517.93261: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204517.898773-36407-1719545657728=/root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204517.93264: variable 'ansible_module_compression' from source: unknown 34886 1727204517.93297: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204517.93345: variable 'ansible_facts' from source: unknown 34886 1727204517.93457: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728/AnsiballZ_command.py 34886 1727204517.93629: Sending initial data 34886 1727204517.93763: Sent initial data (153 bytes) 34886 1727204517.94406: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204517.94496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204517.94517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204517.94590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204517.96274: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204517.96344: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204517.96397: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpjtfwhtn7 /root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728/AnsiballZ_command.py <<< 34886 1727204517.96401: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728/AnsiballZ_command.py" <<< 34886 1727204517.96432: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmpjtfwhtn7" to remote "/root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728/AnsiballZ_command.py" <<< 34886 1727204517.97605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204517.97622: stderr chunk (state=3): >>><<< 34886 1727204517.97637: stdout chunk (state=3): >>><<< 34886 1727204517.97669: done transferring module to remote 34886 1727204517.97695: _low_level_execute_command(): starting 34886 1727204517.97723: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728/ /root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728/AnsiballZ_command.py && sleep 0' 34886 1727204517.98254: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204517.98261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204517.98288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204517.98293: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204517.98297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204517.98350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204517.98365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204517.98409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204518.00363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204518.00368: stdout chunk (state=3): >>><<< 34886 1727204518.00506: stderr chunk (state=3): >>><<< 34886 1727204518.00511: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204518.00514: _low_level_execute_command(): starting 34886 1727204518.00517: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728/AnsiballZ_command.py && sleep 0' 34886 1727204518.00954: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204518.00980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 34886 1727204518.00995: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.01033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204518.01052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204518.01095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204518.20267: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-24 15:01:58.181086", "end": "2024-09-24 15:01:58.200634", "delta": "0:00:00.019548", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204518.21961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204518.22017: stderr chunk (state=3): >>><<< 34886 1727204518.22021: stdout chunk (state=3): >>><<< 34886 1727204518.22043: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-24 15:01:58.181086", "end": "2024-09-24 15:01:58.200634", "delta": "0:00:00.019548", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
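The exchange above is Ansible's standard non-pipelined execution sequence for a command task: create a per-task temp directory under ~/.ansible/tmp, upload the packed AnsiballZ_command.py over SFTP, chmod it, run it with the remote Python, read the JSON result from stdout, and finally remove the temp directory. The following is a rough sketch of those shell steps using plain subprocess over ssh/scp; the host string, helper names, and the absence of error handling are illustrative assumptions, not Ansible's actual connection plugin.

import json
import subprocess
import time

HOST = "root@10.31.10.90"          # target address seen in the log (assumed reachable with ssh keys)
REMOTE_PY = "/usr/bin/python3.12"  # interpreter the log invokes on the managed node

def ssh(cmd):
    # run one shell command on the target, roughly what _low_level_execute_command() does
    p = subprocess.run(["ssh", HOST, cmd], capture_output=True, text=True)
    return p.returncode, p.stdout, p.stderr

def run_packed_module(local_module):
    # 1. per-task temp dir (Ansible names it ansible-tmp-<epoch>-<pid>-<random>)
    tmp = f"/root/.ansible/tmp/ansible-tmp-sketch-{int(time.time())}"
    ssh(f'umask 77 && mkdir -p /root/.ansible/tmp && mkdir "{tmp}"')
    # 2. transfer the module (the log uses sftp; scp is close enough for a sketch)
    subprocess.run(["scp", local_module, f"{HOST}:{tmp}/AnsiballZ_command.py"], check=True)
    # 3. make it executable, 4. run it and capture the JSON result on stdout
    ssh(f"chmod u+x {tmp}/ {tmp}/AnsiballZ_command.py")
    rc, out, err = ssh(f"{REMOTE_PY} {tmp}/AnsiballZ_command.py")
    # 5. clean up the temp dir, as the log does with 'rm -f -r ... > /dev/null 2>&1'
    ssh(f"rm -f -r {tmp}/ > /dev/null 2>&1")
    return json.loads(out)   # e.g. {"changed": true, "cmd": ["ip", "link", "del", "veth0", ...], ...}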
34886 1727204518.22077: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204518.22085: _low_level_execute_command(): starting 34886 1727204518.22092: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204517.898773-36407-1719545657728/ > /dev/null 2>&1 && sleep 0' 34886 1727204518.22550: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204518.22555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.22558: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204518.22560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.22615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204518.22618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204518.22661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204518.24641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204518.24695: stderr chunk (state=3): >>><<< 34886 1727204518.24699: stdout chunk (state=3): >>><<< 34886 1727204518.24717: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204518.24730: handler run complete 34886 1727204518.24754: Evaluated conditional (False): False 34886 1727204518.24767: attempt loop complete, returning result 34886 1727204518.24771: _execute() done 34886 1727204518.24773: dumping result to json 34886 1727204518.24780: done dumping result, returning 34886 1727204518.24788: done running TaskExecutor() for managed-node3/TASK: Delete veth interface veth0 [12b410aa-8751-04b9-2e74-0000000005d2] 34886 1727204518.24796: sending task result for task 12b410aa-8751-04b9-2e74-0000000005d2 34886 1727204518.24909: done sending task result for task 12b410aa-8751-04b9-2e74-0000000005d2 34886 1727204518.24912: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.019548", "end": "2024-09-24 15:01:58.200634", "rc": 0, "start": "2024-09-24 15:01:58.181086" } 34886 1727204518.25021: no more pending results, returning what we have 34886 1727204518.25026: results queue empty 34886 1727204518.25027: checking for any_errors_fatal 34886 1727204518.25032: done checking for any_errors_fatal 34886 1727204518.25033: checking for max_fail_percentage 34886 1727204518.25035: done checking for max_fail_percentage 34886 1727204518.25036: checking to see if all hosts have failed and the running result is not ok 34886 1727204518.25037: done checking to see if all hosts have failed 34886 1727204518.25038: getting the remaining hosts for this loop 34886 1727204518.25039: done getting the remaining hosts for this loop 34886 1727204518.25045: getting the next task for host managed-node3 34886 1727204518.25052: done getting next task for host managed-node3 34886 1727204518.25055: ^ task is: TASK: Create dummy interface {{ interface }} 34886 1727204518.25059: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204518.25063: getting variables 34886 1727204518.25065: in VariableManager get_vars() 34886 1727204518.25109: Calling all_inventory to load vars for managed-node3 34886 1727204518.25112: Calling groups_inventory to load vars for managed-node3 34886 1727204518.25115: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204518.25127: Calling all_plugins_play to load vars for managed-node3 34886 1727204518.25130: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204518.25133: Calling groups_plugins_play to load vars for managed-node3 34886 1727204518.26545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204518.28113: done with get_vars() 34886 1727204518.28144: done getting variables 34886 1727204518.28206: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204518.28305: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:01:58 -0400 (0:00:00.442) 0:00:36.451 ***** 34886 1727204518.28334: entering _queue_task() for managed-node3/command 34886 1727204518.28616: worker is 1 (out of 1 available) 34886 1727204518.28633: exiting _queue_task() for managed-node3/command 34886 1727204518.28648: done queuing things up, now waiting for results queue to drain 34886 1727204518.28650: waiting for pending results... 
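The task titles in manage_test_interface.yml are themselves templated ("Create dummy interface {{ interface }}"), which is why the banner above already reads "Create dummy interface veth0" once the play var interface has been resolved. A minimal sketch of that rendering with plain Jinja2; Ansible runs the same expressions through its own Templar, so this shows only the templating step in isolation.

from jinja2 import Template

play_vars = {"interface": "veth0"}   # value taken from the rendered banners above
for raw_name in (
    "Create dummy interface {{ interface }}",
    "Delete dummy interface {{ interface }}",
    "Create tap interface {{ interface }}",
    "Delete tap interface {{ interface }}",
):
    print(Template(raw_name).render(**play_vars))   # -> "Create dummy interface veth0", ...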
34886 1727204518.28845: running TaskExecutor() for managed-node3/TASK: Create dummy interface veth0 34886 1727204518.28933: in run() - task 12b410aa-8751-04b9-2e74-0000000005d3 34886 1727204518.28945: variable 'ansible_search_path' from source: unknown 34886 1727204518.28949: variable 'ansible_search_path' from source: unknown 34886 1727204518.28990: calling self._execute() 34886 1727204518.29071: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204518.29079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204518.29096: variable 'omit' from source: magic vars 34886 1727204518.29407: variable 'ansible_distribution_major_version' from source: facts 34886 1727204518.29422: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204518.29593: variable 'type' from source: play vars 34886 1727204518.29597: variable 'state' from source: include params 34886 1727204518.29604: variable 'interface' from source: play vars 34886 1727204518.29607: variable 'current_interfaces' from source: set_fact 34886 1727204518.29617: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 34886 1727204518.29622: when evaluation is False, skipping this task 34886 1727204518.29625: _execute() done 34886 1727204518.29628: dumping result to json 34886 1727204518.29631: done dumping result, returning 34886 1727204518.29637: done running TaskExecutor() for managed-node3/TASK: Create dummy interface veth0 [12b410aa-8751-04b9-2e74-0000000005d3] 34886 1727204518.29649: sending task result for task 12b410aa-8751-04b9-2e74-0000000005d3 34886 1727204518.29737: done sending task result for task 12b410aa-8751-04b9-2e74-0000000005d3 34886 1727204518.29740: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 34886 1727204518.29808: no more pending results, returning what we have 34886 1727204518.29813: results queue empty 34886 1727204518.29814: checking for any_errors_fatal 34886 1727204518.29825: done checking for any_errors_fatal 34886 1727204518.29826: checking for max_fail_percentage 34886 1727204518.29827: done checking for max_fail_percentage 34886 1727204518.29828: checking to see if all hosts have failed and the running result is not ok 34886 1727204518.29829: done checking to see if all hosts have failed 34886 1727204518.29830: getting the remaining hosts for this loop 34886 1727204518.29832: done getting the remaining hosts for this loop 34886 1727204518.29836: getting the next task for host managed-node3 34886 1727204518.29842: done getting next task for host managed-node3 34886 1727204518.29845: ^ task is: TASK: Delete dummy interface {{ interface }} 34886 1727204518.29849: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204518.29852: getting variables 34886 1727204518.29854: in VariableManager get_vars() 34886 1727204518.29896: Calling all_inventory to load vars for managed-node3 34886 1727204518.29899: Calling groups_inventory to load vars for managed-node3 34886 1727204518.29901: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204518.29912: Calling all_plugins_play to load vars for managed-node3 34886 1727204518.29916: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204518.29921: Calling groups_plugins_play to load vars for managed-node3 34886 1727204518.31125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204518.32830: done with get_vars() 34886 1727204518.32855: done getting variables 34886 1727204518.32915: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204518.33010: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:01:58 -0400 (0:00:00.047) 0:00:36.498 ***** 34886 1727204518.33041: entering _queue_task() for managed-node3/command 34886 1727204518.33322: worker is 1 (out of 1 available) 34886 1727204518.33339: exiting _queue_task() for managed-node3/command 34886 1727204518.33353: done queuing things up, now waiting for results queue to drain 34886 1727204518.33355: waiting for pending results... 
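Each of the dummy/tap variants above is guarded by a 'when' expression over type, state, interface and current_interfaces. Since the branch that actually executed in this run was the veth deletion, every dummy and tap guard evaluates to False and the task is reported as skipped with the false_condition shown. Below is a small sketch of that boolean logic in plain Python, with assumed variable values that are consistent with this run but not read from the actual facts; the playbook variable is literally named type, renamed iface_type here to avoid shadowing the Python builtin.

# Assumed values consistent with the run above: a veth teardown.
iface_type, state, interface = "veth", "absent", "veth0"
current_interfaces = ["lo", "eth0", "peerveth0", "veth0"]   # hypothetical set_fact contents

when_guards = {
    "Create dummy interface veth0": iface_type == "dummy" and state == "present" and interface not in current_interfaces,
    "Delete dummy interface veth0": iface_type == "dummy" and state == "absent" and interface in current_interfaces,
    "Create tap interface veth0":   iface_type == "tap" and state == "present" and interface not in current_interfaces,
    "Delete tap interface veth0":   iface_type == "tap" and state == "absent" and interface in current_interfaces,
}
for task_name, passed in when_guards.items():
    if not passed:   # all four are False here, matching the "skipping:" results in the log
        print("skipping:", task_name, {"changed": False, "skip_reason": "Conditional result was False"})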
34886 1727204518.33548: running TaskExecutor() for managed-node3/TASK: Delete dummy interface veth0 34886 1727204518.33638: in run() - task 12b410aa-8751-04b9-2e74-0000000005d4 34886 1727204518.33652: variable 'ansible_search_path' from source: unknown 34886 1727204518.33656: variable 'ansible_search_path' from source: unknown 34886 1727204518.33694: calling self._execute() 34886 1727204518.33781: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204518.33793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204518.33807: variable 'omit' from source: magic vars 34886 1727204518.34127: variable 'ansible_distribution_major_version' from source: facts 34886 1727204518.34138: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204518.34304: variable 'type' from source: play vars 34886 1727204518.34310: variable 'state' from source: include params 34886 1727204518.34315: variable 'interface' from source: play vars 34886 1727204518.34323: variable 'current_interfaces' from source: set_fact 34886 1727204518.34330: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 34886 1727204518.34333: when evaluation is False, skipping this task 34886 1727204518.34336: _execute() done 34886 1727204518.34341: dumping result to json 34886 1727204518.34351: done dumping result, returning 34886 1727204518.34357: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface veth0 [12b410aa-8751-04b9-2e74-0000000005d4] 34886 1727204518.34361: sending task result for task 12b410aa-8751-04b9-2e74-0000000005d4 34886 1727204518.34453: done sending task result for task 12b410aa-8751-04b9-2e74-0000000005d4 34886 1727204518.34457: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34886 1727204518.34521: no more pending results, returning what we have 34886 1727204518.34526: results queue empty 34886 1727204518.34527: checking for any_errors_fatal 34886 1727204518.34536: done checking for any_errors_fatal 34886 1727204518.34537: checking for max_fail_percentage 34886 1727204518.34538: done checking for max_fail_percentage 34886 1727204518.34540: checking to see if all hosts have failed and the running result is not ok 34886 1727204518.34541: done checking to see if all hosts have failed 34886 1727204518.34542: getting the remaining hosts for this loop 34886 1727204518.34543: done getting the remaining hosts for this loop 34886 1727204518.34547: getting the next task for host managed-node3 34886 1727204518.34553: done getting next task for host managed-node3 34886 1727204518.34557: ^ task is: TASK: Create tap interface {{ interface }} 34886 1727204518.34560: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204518.34564: getting variables 34886 1727204518.34565: in VariableManager get_vars() 34886 1727204518.34613: Calling all_inventory to load vars for managed-node3 34886 1727204518.34616: Calling groups_inventory to load vars for managed-node3 34886 1727204518.34621: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204518.34633: Calling all_plugins_play to load vars for managed-node3 34886 1727204518.34636: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204518.34640: Calling groups_plugins_play to load vars for managed-node3 34886 1727204518.35876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204518.37480: done with get_vars() 34886 1727204518.37506: done getting variables 34886 1727204518.37561: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204518.37657: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:01:58 -0400 (0:00:00.046) 0:00:36.544 ***** 34886 1727204518.37682: entering _queue_task() for managed-node3/command 34886 1727204518.37949: worker is 1 (out of 1 available) 34886 1727204518.37963: exiting _queue_task() for managed-node3/command 34886 1727204518.37977: done queuing things up, now waiting for results queue to drain 34886 1727204518.37979: waiting for pending results... 
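The figures in each task banner, e.g. "(0:00:00.047) 0:00:36.498", appear to come from a timing callback: the first value is the duration of the task that just finished and the second is the running total of the play, and the totals in this excerpt add up accordingly (36.451 + 0.047 = 36.498, 36.498 + 0.046 = 36.544, and so on, give or take millisecond rounding). A quick sketch of that arithmetic:

from datetime import timedelta

running_total = timedelta(seconds=36.451)            # total reported at the "Create dummy" banner
for task_seconds in (0.047, 0.046, 0.056, 0.074):    # durations printed at the following banners
    running_total += timedelta(seconds=task_seconds)
    print(f"(0:00:{task_seconds:06.3f}) 0:00:{running_total.total_seconds():06.3f}")
# -> 36.498, 36.544, 36.600, 36.674; the log itself shows 36.601/36.676 because its
#    per-task durations are rounded to milliseconds before being printed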
34886 1727204518.38162: running TaskExecutor() for managed-node3/TASK: Create tap interface veth0 34886 1727204518.38254: in run() - task 12b410aa-8751-04b9-2e74-0000000005d5 34886 1727204518.38267: variable 'ansible_search_path' from source: unknown 34886 1727204518.38271: variable 'ansible_search_path' from source: unknown 34886 1727204518.38305: calling self._execute() 34886 1727204518.38393: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204518.38400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204518.38410: variable 'omit' from source: magic vars 34886 1727204518.38868: variable 'ansible_distribution_major_version' from source: facts 34886 1727204518.38872: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204518.39073: variable 'type' from source: play vars 34886 1727204518.39080: variable 'state' from source: include params 34886 1727204518.39086: variable 'interface' from source: play vars 34886 1727204518.39092: variable 'current_interfaces' from source: set_fact 34886 1727204518.39104: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 34886 1727204518.39107: when evaluation is False, skipping this task 34886 1727204518.39110: _execute() done 34886 1727204518.39115: dumping result to json 34886 1727204518.39118: done dumping result, returning 34886 1727204518.39126: done running TaskExecutor() for managed-node3/TASK: Create tap interface veth0 [12b410aa-8751-04b9-2e74-0000000005d5] 34886 1727204518.39134: sending task result for task 12b410aa-8751-04b9-2e74-0000000005d5 34886 1727204518.39241: done sending task result for task 12b410aa-8751-04b9-2e74-0000000005d5 34886 1727204518.39244: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 34886 1727204518.39454: no more pending results, returning what we have 34886 1727204518.39457: results queue empty 34886 1727204518.39459: checking for any_errors_fatal 34886 1727204518.39464: done checking for any_errors_fatal 34886 1727204518.39465: checking for max_fail_percentage 34886 1727204518.39467: done checking for max_fail_percentage 34886 1727204518.39468: checking to see if all hosts have failed and the running result is not ok 34886 1727204518.39469: done checking to see if all hosts have failed 34886 1727204518.39470: getting the remaining hosts for this loop 34886 1727204518.39471: done getting the remaining hosts for this loop 34886 1727204518.39475: getting the next task for host managed-node3 34886 1727204518.39481: done getting next task for host managed-node3 34886 1727204518.39484: ^ task is: TASK: Delete tap interface {{ interface }} 34886 1727204518.39487: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204518.39494: getting variables 34886 1727204518.39495: in VariableManager get_vars() 34886 1727204518.39533: Calling all_inventory to load vars for managed-node3 34886 1727204518.39536: Calling groups_inventory to load vars for managed-node3 34886 1727204518.39539: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204518.39552: Calling all_plugins_play to load vars for managed-node3 34886 1727204518.39556: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204518.39560: Calling groups_plugins_play to load vars for managed-node3 34886 1727204518.41019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204518.43097: done with get_vars() 34886 1727204518.43131: done getting variables 34886 1727204518.43202: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34886 1727204518.43324: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:01:58 -0400 (0:00:00.056) 0:00:36.601 ***** 34886 1727204518.43358: entering _queue_task() for managed-node3/command 34886 1727204518.43673: worker is 1 (out of 1 available) 34886 1727204518.43688: exiting _queue_task() for managed-node3/command 34886 1727204518.43704: done queuing things up, now waiting for results queue to drain 34886 1727204518.43706: waiting for pending results... 
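The recurring "auto-mux: Trying existing master ... mux_client_request_session: master session id: 2" lines in the SSH stderr throughout this run mean that none of these per-task commands opens a fresh SSH connection: they are all multiplexed over one persistent ControlMaster session. The sketch below shows the OpenSSH options involved; the ControlPath is a made-up example, whereas Ansible's default ssh_args use ControlMaster/ControlPersist with their own socket path under ~/.ansible/cp.

import subprocess

ssh_argv = [
    "ssh", "-vv",                                   # -vv yields the debug1:/debug2: lines seen above
    "-o", "ControlMaster=auto",                     # reuse a master connection, or start one if absent
    "-o", "ControlPersist=60s",                     # keep the master alive between task commands
    "-o", "ControlPath=/tmp/example-mux-%r@%h:%p",  # hypothetical socket path
    "root@10.31.10.90",
    "/bin/sh -c 'echo ~ && sleep 0'",               # the same probe command the log runs
]
subprocess.run(ssh_argv)   # repeated runs within 60s print "auto-mux: Trying existing master"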
34886 1727204518.44116: running TaskExecutor() for managed-node3/TASK: Delete tap interface veth0 34886 1727204518.44125: in run() - task 12b410aa-8751-04b9-2e74-0000000005d6 34886 1727204518.44141: variable 'ansible_search_path' from source: unknown 34886 1727204518.44144: variable 'ansible_search_path' from source: unknown 34886 1727204518.44229: calling self._execute() 34886 1727204518.44288: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204518.44298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204518.44310: variable 'omit' from source: magic vars 34886 1727204518.44733: variable 'ansible_distribution_major_version' from source: facts 34886 1727204518.44746: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204518.45097: variable 'type' from source: play vars 34886 1727204518.45101: variable 'state' from source: include params 34886 1727204518.45105: variable 'interface' from source: play vars 34886 1727204518.45109: variable 'current_interfaces' from source: set_fact 34886 1727204518.45116: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 34886 1727204518.45122: when evaluation is False, skipping this task 34886 1727204518.45126: _execute() done 34886 1727204518.45129: dumping result to json 34886 1727204518.45132: done dumping result, returning 34886 1727204518.45135: done running TaskExecutor() for managed-node3/TASK: Delete tap interface veth0 [12b410aa-8751-04b9-2e74-0000000005d6] 34886 1727204518.45138: sending task result for task 12b410aa-8751-04b9-2e74-0000000005d6 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34886 1727204518.45362: no more pending results, returning what we have 34886 1727204518.45365: results queue empty 34886 1727204518.45367: checking for any_errors_fatal 34886 1727204518.45373: done checking for any_errors_fatal 34886 1727204518.45374: checking for max_fail_percentage 34886 1727204518.45376: done checking for max_fail_percentage 34886 1727204518.45377: checking to see if all hosts have failed and the running result is not ok 34886 1727204518.45378: done checking to see if all hosts have failed 34886 1727204518.45379: getting the remaining hosts for this loop 34886 1727204518.45380: done getting the remaining hosts for this loop 34886 1727204518.45384: getting the next task for host managed-node3 34886 1727204518.45394: done getting next task for host managed-node3 34886 1727204518.45397: ^ task is: TASK: Clean up namespace 34886 1727204518.45400: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=6, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204518.45403: getting variables 34886 1727204518.45405: in VariableManager get_vars() 34886 1727204518.45444: Calling all_inventory to load vars for managed-node3 34886 1727204518.45448: Calling groups_inventory to load vars for managed-node3 34886 1727204518.45451: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204518.45463: Calling all_plugins_play to load vars for managed-node3 34886 1727204518.45467: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204518.45472: Calling groups_plugins_play to load vars for managed-node3 34886 1727204518.46002: done sending task result for task 12b410aa-8751-04b9-2e74-0000000005d6 34886 1727204518.46005: WORKER PROCESS EXITING 34886 1727204518.47636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204518.50699: done with get_vars() 34886 1727204518.50734: done getting variables 34886 1727204518.50806: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Clean up namespace] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:108 Tuesday 24 September 2024 15:01:58 -0400 (0:00:00.074) 0:00:36.676 ***** 34886 1727204518.50839: entering _queue_task() for managed-node3/command 34886 1727204518.51177: worker is 1 (out of 1 available) 34886 1727204518.51395: exiting _queue_task() for managed-node3/command 34886 1727204518.51408: done queuing things up, now waiting for results queue to drain 34886 1727204518.51410: waiting for pending results... 
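The "_queue_task ... worker is 1 (out of 1 available) ... waiting for pending results ... sending task result ... WORKER PROCESS EXITING" lines trace the strategy handing each task to a worker process and then draining a results queue. The toy sketch below mirrors that hand-off with multiprocessing; it is an analogy for the control flow only, not Ansible's WorkerProcess implementation.

import multiprocessing as mp

def worker(task_name, results):
    # a real worker runs TaskExecutor(); here we just emit a skipped-style result
    results.put({"task": task_name, "changed": False,
                 "skip_reason": "Conditional result was False"})

if __name__ == "__main__":
    results = mp.Queue()
    proc = mp.Process(target=worker, args=("Delete tap interface veth0", results))
    proc.start()             # "worker is 1 (out of 1 available)"
    print(results.get())     # "waiting for pending results..." until the task result arrives
    proc.join()              # "WORKER PROCESS EXITING"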
34886 1727204518.51541: running TaskExecutor() for managed-node3/TASK: Clean up namespace 34886 1727204518.51626: in run() - task 12b410aa-8751-04b9-2e74-0000000000b4 34886 1727204518.51796: variable 'ansible_search_path' from source: unknown 34886 1727204518.51799: calling self._execute() 34886 1727204518.51802: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204518.51816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204518.51833: variable 'omit' from source: magic vars 34886 1727204518.52266: variable 'ansible_distribution_major_version' from source: facts 34886 1727204518.52285: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204518.52298: variable 'omit' from source: magic vars 34886 1727204518.52326: variable 'omit' from source: magic vars 34886 1727204518.52381: variable 'omit' from source: magic vars 34886 1727204518.52432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204518.52481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204518.52511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204518.52538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204518.52556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204518.52601: variable 'inventory_hostname' from source: host vars for 'managed-node3' 34886 1727204518.52611: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204518.52619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204518.52753: Set connection var ansible_timeout to 10 34886 1727204518.52767: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204518.52775: Set connection var ansible_connection to ssh 34886 1727204518.52894: Set connection var ansible_shell_executable to /bin/sh 34886 1727204518.52897: Set connection var ansible_pipelining to False 34886 1727204518.52902: Set connection var ansible_shell_type to sh 34886 1727204518.52904: variable 'ansible_shell_executable' from source: unknown 34886 1727204518.52906: variable 'ansible_connection' from source: unknown 34886 1727204518.52908: variable 'ansible_module_compression' from source: unknown 34886 1727204518.52910: variable 'ansible_shell_type' from source: unknown 34886 1727204518.52913: variable 'ansible_shell_executable' from source: unknown 34886 1727204518.52915: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204518.52917: variable 'ansible_pipelining' from source: unknown 34886 1727204518.52919: variable 'ansible_timeout' from source: unknown 34886 1727204518.52921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204518.53062: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204518.53081: variable 'omit' from source: magic vars 34886 1727204518.53095: starting attempt loop 34886 1727204518.53103: running the handler 
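For each task the log resolves a handful of connection variables before touching the host; the values observed for managed-node3 (10.31.10.90) in this run are summarised below for reference. Note in particular ansible_pipelining = False, which is why every command task in this run goes through the explicit temp-dir / SFTP transfer / chmod / execute / cleanup cycle instead of piping the module over stdin. The flat dict is only a summary of what the log prints; Ansible keeps these on the play context and connection options rather than in a plain mapping.

# Connection settings printed above for managed-node3, gathered as a reference dict.
connection_vars = {
    "ansible_connection": "ssh",
    "ansible_host": "10.31.10.90",
    "ansible_timeout": 10,
    "ansible_module_compression": "ZIP_DEFLATED",
    "ansible_shell_type": "sh",
    "ansible_shell_executable": "/bin/sh",
    "ansible_pipelining": False,   # forces the AnsiballZ file transfer seen in this run
}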
34886 1727204518.53130: _low_level_execute_command(): starting 34886 1727204518.53142: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204518.54002: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.54022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204518.54040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204518.54064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204518.54136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204518.55896: stdout chunk (state=3): >>>/root <<< 34886 1727204518.56100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204518.56103: stdout chunk (state=3): >>><<< 34886 1727204518.56106: stderr chunk (state=3): >>><<< 34886 1727204518.56127: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204518.56157: _low_level_execute_command(): starting 34886 1727204518.56254: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046 `" && echo ansible-tmp-1727204518.5614212-36427-102661657042046="` echo /root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046 `" ) && sleep 0' 34886 1727204518.56809: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.56819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34886 1727204518.56911: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.56932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204518.56941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204518.56950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204518.57017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204518.59035: stdout chunk (state=3): >>>ansible-tmp-1727204518.5614212-36427-102661657042046=/root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046 <<< 34886 1727204518.59208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204518.59234: stdout chunk (state=3): >>><<< 34886 1727204518.59238: stderr chunk (state=3): >>><<< 34886 1727204518.59268: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204518.5614212-36427-102661657042046=/root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204518.59395: variable 'ansible_module_compression' from source: unknown 34886 1727204518.59398: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204518.59427: variable 'ansible_facts' from source: unknown 34886 1727204518.59539: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046/AnsiballZ_command.py 34886 1727204518.59760: Sending initial data 34886 1727204518.59764: Sent initial data (156 bytes) 34886 1727204518.60357: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204518.60373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 34886 1727204518.60387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.60441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204518.60458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204518.60495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204518.62113: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204518.62148: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34886 1727204518.62223: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp5gbmyq33 /root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046/AnsiballZ_command.py <<< 34886 1727204518.62228: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046/AnsiballZ_command.py" <<< 34886 1727204518.62231: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp5gbmyq33" to remote "/root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046/AnsiballZ_command.py" <<< 34886 1727204518.63255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204518.63312: stderr chunk (state=3): >>><<< 34886 1727204518.63316: stdout chunk (state=3): >>><<< 34886 1727204518.63337: done transferring module to remote 34886 1727204518.63353: _low_level_execute_command(): starting 34886 1727204518.63356: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046/ /root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046/AnsiballZ_command.py && sleep 0' 34886 1727204518.63778: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204518.63783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.63786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204518.63788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 34886 1727204518.63793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.63848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204518.63850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204518.63884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204518.65810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204518.65813: stdout chunk (state=3): >>><<< 34886 1727204518.65822: stderr chunk (state=3): >>><<< 34886 1727204518.65936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204518.65940: _low_level_execute_command(): starting 34886 1727204518.65943: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046/AnsiballZ_command.py && sleep 0' 34886 1727204518.66404: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204518.66408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.66411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204518.66413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.66459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204518.66466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204518.66512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204518.84503: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-24 15:01:58.838568", "end": "2024-09-24 15:01:58.843837", "delta": "0:00:00.005269", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204518.86146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204518.86211: stderr chunk (state=3): >>><<< 34886 1727204518.86215: stdout chunk (state=3): >>><<< 34886 1727204518.86235: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-24 15:01:58.838568", "end": "2024-09-24 15:01:58.843837", "delta": "0:00:00.005269", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
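For reference, the module run traced above corresponds to the "Clean up namespace" task whose arguments are logged just below ('ip netns delete ns1', '_uses_shell': false). A minimal sketch of that task, reconstructed only from those logged arguments (the real YAML in tests_ipv6.yml is not shown in this log, and the changed_when line is a guess based on the ok: summary printing "changed": false even though the module itself reported changed=true):

# Sketch reconstructed from the logged module args; not the verbatim task file.
- name: Clean up namespace
  ansible.builtin.command: ip netns delete ns1   # '_uses_shell': false in the log, so the bare command module fits
  changed_when: false                            # assumption: summary below prints "changed": false despite the module returning changed=true

Because _uses_shell is false, the command is split into argv form (["ip", "netns", "delete", "ns1"]) with no shell involved, which matches the "cmd" list in the JSON result above.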
34886 1727204518.86272: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns delete ns1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204518.86280: _low_level_execute_command(): starting 34886 1727204518.86286: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204518.5614212-36427-102661657042046/ > /dev/null 2>&1 && sleep 0' 34886 1727204518.86762: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204518.86801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34886 1727204518.86805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204518.86808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.86810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204518.86812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204518.86856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204518.86873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204518.86916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204518.88836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204518.88896: stderr chunk (state=3): >>><<< 34886 1727204518.88899: stdout chunk (state=3): >>><<< 34886 1727204518.88916: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204518.88924: handler run complete 34886 1727204518.88946: Evaluated conditional (False): False 34886 1727204518.88960: attempt loop complete, returning result 34886 1727204518.88963: _execute() done 34886 1727204518.88966: dumping result to json 34886 1727204518.88972: done dumping result, returning 34886 1727204518.88981: done running TaskExecutor() for managed-node3/TASK: Clean up namespace [12b410aa-8751-04b9-2e74-0000000000b4] 34886 1727204518.88993: sending task result for task 12b410aa-8751-04b9-2e74-0000000000b4 34886 1727204518.89102: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000b4 34886 1727204518.89105: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "netns", "delete", "ns1" ], "delta": "0:00:00.005269", "end": "2024-09-24 15:01:58.843837", "rc": 0, "start": "2024-09-24 15:01:58.838568" } 34886 1727204518.89197: no more pending results, returning what we have 34886 1727204518.89201: results queue empty 34886 1727204518.89202: checking for any_errors_fatal 34886 1727204518.89208: done checking for any_errors_fatal 34886 1727204518.89209: checking for max_fail_percentage 34886 1727204518.89211: done checking for max_fail_percentage 34886 1727204518.89212: checking to see if all hosts have failed and the running result is not ok 34886 1727204518.89213: done checking to see if all hosts have failed 34886 1727204518.89223: getting the remaining hosts for this loop 34886 1727204518.89226: done getting the remaining hosts for this loop 34886 1727204518.89231: getting the next task for host managed-node3 34886 1727204518.89236: done getting next task for host managed-node3 34886 1727204518.89240: ^ task is: TASK: Verify network state restored to default 34886 1727204518.89242: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=7, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34886 1727204518.89245: getting variables 34886 1727204518.89247: in VariableManager get_vars() 34886 1727204518.89293: Calling all_inventory to load vars for managed-node3 34886 1727204518.89296: Calling groups_inventory to load vars for managed-node3 34886 1727204518.89299: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204518.89311: Calling all_plugins_play to load vars for managed-node3 34886 1727204518.89314: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204518.89317: Calling groups_plugins_play to load vars for managed-node3 34886 1727204518.94279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204518.96445: done with get_vars() 34886 1727204518.96468: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:113 Tuesday 24 September 2024 15:01:58 -0400 (0:00:00.456) 0:00:37.133 ***** 34886 1727204518.96537: entering _queue_task() for managed-node3/include_tasks 34886 1727204518.96815: worker is 1 (out of 1 available) 34886 1727204518.96832: exiting _queue_task() for managed-node3/include_tasks 34886 1727204518.96845: done queuing things up, now waiting for results queue to drain 34886 1727204518.96848: waiting for pending results... 34886 1727204518.97040: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 34886 1727204518.97114: in run() - task 12b410aa-8751-04b9-2e74-0000000000b5 34886 1727204518.97129: variable 'ansible_search_path' from source: unknown 34886 1727204518.97164: calling self._execute() 34886 1727204518.97255: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204518.97260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204518.97271: variable 'omit' from source: magic vars 34886 1727204518.97592: variable 'ansible_distribution_major_version' from source: facts 34886 1727204518.97604: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204518.97610: _execute() done 34886 1727204518.97615: dumping result to json 34886 1727204518.97625: done dumping result, returning 34886 1727204518.97628: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [12b410aa-8751-04b9-2e74-0000000000b5] 34886 1727204518.97633: sending task result for task 12b410aa-8751-04b9-2e74-0000000000b5 34886 1727204518.97738: done sending task result for task 12b410aa-8751-04b9-2e74-0000000000b5 34886 1727204518.97743: WORKER PROCESS EXITING 34886 1727204518.97810: no more pending results, returning what we have 34886 1727204518.97817: in VariableManager get_vars() 34886 1727204518.97873: Calling all_inventory to load vars for managed-node3 34886 1727204518.97877: Calling groups_inventory to load vars for managed-node3 34886 1727204518.97880: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204518.97898: Calling all_plugins_play to load vars for managed-node3 34886 1727204518.97909: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204518.97914: Calling groups_plugins_play to load vars for managed-node3 34886 1727204519.00150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204519.03297: done with get_vars() 34886 1727204519.03332: 
variable 'ansible_search_path' from source: unknown 34886 1727204519.03353: we have included files to process 34886 1727204519.03355: generating all_blocks data 34886 1727204519.03357: done generating all_blocks data 34886 1727204519.03363: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 34886 1727204519.03365: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 34886 1727204519.03367: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 34886 1727204519.03918: done processing included file 34886 1727204519.03923: iterating over new_blocks loaded from include file 34886 1727204519.03925: in VariableManager get_vars() 34886 1727204519.03948: done with get_vars() 34886 1727204519.03949: filtering new block on tags 34886 1727204519.03975: done filtering new block on tags 34886 1727204519.03978: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node3 34886 1727204519.03983: extending task lists for all hosts with included blocks 34886 1727204519.07954: done extending task lists 34886 1727204519.07956: done processing included files 34886 1727204519.07957: results queue empty 34886 1727204519.07958: checking for any_errors_fatal 34886 1727204519.07965: done checking for any_errors_fatal 34886 1727204519.07966: checking for max_fail_percentage 34886 1727204519.07967: done checking for max_fail_percentage 34886 1727204519.07968: checking to see if all hosts have failed and the running result is not ok 34886 1727204519.07969: done checking to see if all hosts have failed 34886 1727204519.07970: getting the remaining hosts for this loop 34886 1727204519.07971: done getting the remaining hosts for this loop 34886 1727204519.07974: getting the next task for host managed-node3 34886 1727204519.07979: done getting next task for host managed-node3 34886 1727204519.07981: ^ task is: TASK: Check routes and DNS 34886 1727204519.07984: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204519.07986: getting variables 34886 1727204519.07988: in VariableManager get_vars() 34886 1727204519.08008: Calling all_inventory to load vars for managed-node3 34886 1727204519.08011: Calling groups_inventory to load vars for managed-node3 34886 1727204519.08013: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204519.08023: Calling all_plugins_play to load vars for managed-node3 34886 1727204519.08030: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204519.08035: Calling groups_plugins_play to load vars for managed-node3 34886 1727204519.10153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204519.12492: done with get_vars() 34886 1727204519.12518: done getting variables 34886 1727204519.12559: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:01:59 -0400 (0:00:00.160) 0:00:37.293 ***** 34886 1727204519.12583: entering _queue_task() for managed-node3/shell 34886 1727204519.12876: worker is 1 (out of 1 available) 34886 1727204519.12893: exiting _queue_task() for managed-node3/shell 34886 1727204519.12906: done queuing things up, now waiting for results queue to drain 34886 1727204519.12908: waiting for pending results... 34886 1727204519.13100: running TaskExecutor() for managed-node3/TASK: Check routes and DNS 34886 1727204519.13178: in run() - task 12b410aa-8751-04b9-2e74-00000000075e 34886 1727204519.13193: variable 'ansible_search_path' from source: unknown 34886 1727204519.13196: variable 'ansible_search_path' from source: unknown 34886 1727204519.13235: calling self._execute() 34886 1727204519.13495: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204519.13499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204519.13503: variable 'omit' from source: magic vars 34886 1727204519.13900: variable 'ansible_distribution_major_version' from source: facts 34886 1727204519.13924: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204519.13937: variable 'omit' from source: magic vars 34886 1727204519.14000: variable 'omit' from source: magic vars 34886 1727204519.14060: variable 'omit' from source: magic vars 34886 1727204519.14116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34886 1727204519.14196: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34886 1727204519.14208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34886 1727204519.14242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204519.14278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34886 1727204519.14311: variable 'inventory_hostname' from source: host vars for 'managed-node3' 
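The trace that follows runs the "Check routes and DNS" task from check_network_dns.yml:6 through the shell action plugin backed by ansible.legacy.command. The script itself appears verbatim in the module args logged further down; wrapping it in a task gives roughly the sketch below (the shell payload is copied from the log, while the YAML wrapper is an assumed ansible.builtin.shell form, not the verbatim file contents):

# Sketch: script body taken verbatim from the logged _raw_params; YAML wrapper assumed.
- name: Check routes and DNS
  ansible.builtin.shell: |
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
      cat /etc/resolv.conf
    else
      echo NO /etc/resolv.conf
      ls -alrtF /etc/resolv.* || :
    fi

set -euo pipefail makes the step fail on the first error or unset variable, which is why the shell form ('_uses_shell': true in the logged args) is used here rather than the plain command module.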
34886 1727204519.14387: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204519.14392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204519.14482: Set connection var ansible_timeout to 10 34886 1727204519.14488: Set connection var ansible_module_compression to ZIP_DEFLATED 34886 1727204519.14499: Set connection var ansible_connection to ssh 34886 1727204519.14509: Set connection var ansible_shell_executable to /bin/sh 34886 1727204519.14519: Set connection var ansible_pipelining to False 34886 1727204519.14524: Set connection var ansible_shell_type to sh 34886 1727204519.14547: variable 'ansible_shell_executable' from source: unknown 34886 1727204519.14551: variable 'ansible_connection' from source: unknown 34886 1727204519.14554: variable 'ansible_module_compression' from source: unknown 34886 1727204519.14556: variable 'ansible_shell_type' from source: unknown 34886 1727204519.14561: variable 'ansible_shell_executable' from source: unknown 34886 1727204519.14564: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204519.14570: variable 'ansible_pipelining' from source: unknown 34886 1727204519.14573: variable 'ansible_timeout' from source: unknown 34886 1727204519.14579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204519.14714: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204519.14723: variable 'omit' from source: magic vars 34886 1727204519.14729: starting attempt loop 34886 1727204519.14733: running the handler 34886 1727204519.14744: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34886 1727204519.14763: _low_level_execute_command(): starting 34886 1727204519.14770: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34886 1727204519.15274: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204519.15312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204519.15315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204519.15318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204519.15364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
<<< 34886 1727204519.15368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204519.15421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204519.17194: stdout chunk (state=3): >>>/root <<< 34886 1727204519.17459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204519.17462: stdout chunk (state=3): >>><<< 34886 1727204519.17465: stderr chunk (state=3): >>><<< 34886 1727204519.17471: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204519.17474: _low_level_execute_command(): starting 34886 1727204519.17477: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676 `" && echo ansible-tmp-1727204519.1740057-36453-211426506841676="` echo /root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676 `" ) && sleep 0' 34886 1727204519.18027: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204519.18056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204519.18069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204519.18116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204519.18133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204519.18178: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 34886 1727204519.20176: stdout chunk (state=3): >>>ansible-tmp-1727204519.1740057-36453-211426506841676=/root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676 <<< 34886 1727204519.20323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204519.20385: stderr chunk (state=3): >>><<< 34886 1727204519.20392: stdout chunk (state=3): >>><<< 34886 1727204519.20411: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204519.1740057-36453-211426506841676=/root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204519.20468: variable 'ansible_module_compression' from source: unknown 34886 1727204519.20595: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34886n8odqq6w/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34886 1727204519.20598: variable 'ansible_facts' from source: unknown 34886 1727204519.20775: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676/AnsiballZ_command.py 34886 1727204519.20808: Sending initial data 34886 1727204519.20822: Sent initial data (156 bytes) 34886 1727204519.21387: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204519.21466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204519.21547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204519.21552: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 34886 1727204519.21598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204519.23206: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34886 1727204519.23228: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34886 1727204519.23273: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp_4x9gzee /root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676/AnsiballZ_command.py <<< 34886 1727204519.23275: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676/AnsiballZ_command.py" <<< 34886 1727204519.23308: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 34886 1727204519.23312: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-34886n8odqq6w/tmp_4x9gzee" to remote "/root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676/AnsiballZ_command.py" <<< 34886 1727204519.24442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204519.24446: stderr chunk (state=3): >>><<< 34886 1727204519.24449: stdout chunk (state=3): >>><<< 34886 1727204519.24451: done transferring module to remote 34886 1727204519.24453: _low_level_execute_command(): starting 34886 1727204519.24456: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676/ /root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676/AnsiballZ_command.py && sleep 0' 34886 1727204519.24905: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 34886 1727204519.24920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204519.24934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34886 1727204519.24945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204519.24995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204519.25017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204519.25044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204519.26867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204519.26925: stderr chunk (state=3): >>><<< 34886 1727204519.26929: stdout chunk (state=3): >>><<< 34886 1727204519.26942: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204519.26946: _low_level_execute_command(): starting 34886 1727204519.26952: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676/AnsiballZ_command.py && sleep 0' 34886 1727204519.27407: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204519.27411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 34886 1727204519.27413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34886 1727204519.27415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204519.27473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204519.27478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204519.27519: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204519.45568: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:5e:c8:16:36:1d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.90/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2607sec preferred_lft 2607sec\n inet6 fe80::37d3:4e93:30d:de94/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.90 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.90 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:01:59.445372", "end": "2024-09-24 15:01:59.454237", "delta": "0:00:00.008865", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34886 1727204519.47256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 34886 1727204519.47322: stderr chunk (state=3): >>><<< 34886 1727204519.47327: stdout chunk (state=3): >>><<< 34886 1727204519.47343: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:5e:c8:16:36:1d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.90/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2607sec preferred_lft 2607sec\n inet6 fe80::37d3:4e93:30d:de94/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.90 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.90 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:01:59.445372", "end": "2024-09-24 15:01:59.454237", "delta": "0:00:00.008865", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 34886 1727204519.47400: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34886 1727204519.47409: _low_level_execute_command(): starting 34886 1727204519.47415: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204519.1740057-36453-211426506841676/ > /dev/null 2>&1 && sleep 0' 34886 1727204519.48076: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34886 1727204519.48080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 34886 1727204519.48104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34886 1727204519.48110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34886 1727204519.48175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34886 1727204519.50070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34886 1727204519.50122: stderr chunk (state=3): >>><<< 34886 1727204519.50128: stdout chunk (state=3): >>><<< 34886 1727204519.50143: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34886 1727204519.50152: handler run complete 34886 1727204519.50174: Evaluated conditional (False): False 34886 1727204519.50191: attempt loop complete, returning result 34886 1727204519.50194: _execute() done 34886 1727204519.50197: dumping result to json 34886 1727204519.50205: done dumping result, returning 34886 1727204519.50215: done running TaskExecutor() for managed-node3/TASK: Check routes and DNS [12b410aa-8751-04b9-2e74-00000000075e] 34886 1727204519.50224: sending task result for task 12b410aa-8751-04b9-2e74-00000000075e 34886 1727204519.50346: done sending task result for task 12b410aa-8751-04b9-2e74-00000000075e 34886 1727204519.50348: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008865", "end": "2024-09-24 15:01:59.454237", "rc": 0, "start": "2024-09-24 15:01:59.445372" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:5e:c8:16:36:1d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.10.90/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2607sec preferred_lft 2607sec inet6 fe80::37d3:4e93:30d:de94/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.90 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.90 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. 
# # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 34886 1727204519.50444: no more pending results, returning what we have 34886 1727204519.50448: results queue empty 34886 1727204519.50449: checking for any_errors_fatal 34886 1727204519.50451: done checking for any_errors_fatal 34886 1727204519.50452: checking for max_fail_percentage 34886 1727204519.50453: done checking for max_fail_percentage 34886 1727204519.50455: checking to see if all hosts have failed and the running result is not ok 34886 1727204519.50456: done checking to see if all hosts have failed 34886 1727204519.50457: getting the remaining hosts for this loop 34886 1727204519.50465: done getting the remaining hosts for this loop 34886 1727204519.50470: getting the next task for host managed-node3 34886 1727204519.50476: done getting next task for host managed-node3 34886 1727204519.50479: ^ task is: TASK: Verify DNS and network connectivity 34886 1727204519.50482: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34886 1727204519.50493: getting variables 34886 1727204519.50495: in VariableManager get_vars() 34886 1727204519.50537: Calling all_inventory to load vars for managed-node3 34886 1727204519.50540: Calling groups_inventory to load vars for managed-node3 34886 1727204519.50543: Calling all_plugins_inventory to load vars for managed-node3 34886 1727204519.50554: Calling all_plugins_play to load vars for managed-node3 34886 1727204519.50558: Calling groups_plugins_inventory to load vars for managed-node3 34886 1727204519.50561: Calling groups_plugins_play to load vars for managed-node3 34886 1727204519.52743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34886 1727204519.55829: done with get_vars() 34886 1727204519.55867: done getting variables 34886 1727204519.55943: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:01:59 -0400 (0:00:00.433) 0:00:37.727 ***** 34886 1727204519.55984: entering _queue_task() for managed-node3/shell 34886 1727204519.56529: worker is 1 (out of 1 available) 34886 1727204519.56541: exiting _queue_task() for managed-node3/shell 34886 1727204519.56551: done queuing things up, now waiting for results queue to drain 34886 1727204519.56553: waiting for pending results... 34886 1727204519.56906: running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity 34886 1727204519.56912: in run() - task 12b410aa-8751-04b9-2e74-00000000075f 34886 1727204519.56915: variable 'ansible_search_path' from source: unknown 34886 1727204519.56917: variable 'ansible_search_path' from source: unknown 34886 1727204519.56921: calling self._execute() 34886 1727204519.57036: variable 'ansible_host' from source: host vars for 'managed-node3' 34886 1727204519.57050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 34886 1727204519.57067: variable 'omit' from source: magic vars 34886 1727204519.57513: variable 'ansible_distribution_major_version' from source: facts 34886 1727204519.57531: Evaluated conditional (ansible_distribution_major_version != '6'): True 34886 1727204519.57717: variable 'ansible_facts' from source: unknown 34886 1727204519.58974: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 34886 1727204519.58985: when evaluation is False, skipping this task 34886 1727204519.58996: _execute() done 34886 1727204519.59005: dumping result to json 34886 1727204519.59014: done dumping result, returning 34886 1727204519.59026: done running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity [12b410aa-8751-04b9-2e74-00000000075f] 34886 1727204519.59038: sending task result for task 12b410aa-8751-04b9-2e74-00000000075f skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 34886 1727204519.59351: no more pending results, returning what we have 34886 1727204519.59357: results queue empty 34886 1727204519.59358: checking for any_errors_fatal 34886 
34886 1727204519.59377: done checking for any_errors_fatal
34886 1727204519.59378: checking for max_fail_percentage
34886 1727204519.59380: done checking for max_fail_percentage
34886 1727204519.59381: checking to see if all hosts have failed and the running result is not ok
34886 1727204519.59383: done checking to see if all hosts have failed
34886 1727204519.59383: getting the remaining hosts for this loop
34886 1727204519.59385: done getting the remaining hosts for this loop
34886 1727204519.59393: getting the next task for host managed-node3
34886 1727204519.59403: done getting next task for host managed-node3
34886 1727204519.59406: ^ task is: TASK: meta (flush_handlers)
34886 1727204519.59408: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34886 1727204519.59414: getting variables
34886 1727204519.59416: in VariableManager get_vars()
34886 1727204519.59469: Calling all_inventory to load vars for managed-node3
34886 1727204519.59473: Calling groups_inventory to load vars for managed-node3
34886 1727204519.59477: Calling all_plugins_inventory to load vars for managed-node3
34886 1727204519.59609: Calling all_plugins_play to load vars for managed-node3
34886 1727204519.59615: Calling groups_plugins_inventory to load vars for managed-node3
34886 1727204519.59622: done sending task result for task 12b410aa-8751-04b9-2e74-00000000075f
34886 1727204519.59625: WORKER PROCESS EXITING
34886 1727204519.59630: Calling groups_plugins_play to load vars for managed-node3
34886 1727204519.62132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34886 1727204519.65059: done with get_vars()
34886 1727204519.65096: done getting variables
34886 1727204519.65181: in VariableManager get_vars()
34886 1727204519.65201: Calling all_inventory to load vars for managed-node3
34886 1727204519.65204: Calling groups_inventory to load vars for managed-node3
34886 1727204519.65207: Calling all_plugins_inventory to load vars for managed-node3
34886 1727204519.65212: Calling all_plugins_play to load vars for managed-node3
34886 1727204519.65215: Calling groups_plugins_inventory to load vars for managed-node3
34886 1727204519.65218: Calling groups_plugins_play to load vars for managed-node3
34886 1727204519.67250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34886 1727204519.70459: done with get_vars()
34886 1727204519.70506: done queuing things up, now waiting for results queue to drain
34886 1727204519.70509: results queue empty
34886 1727204519.70510: checking for any_errors_fatal
34886 1727204519.70514: done checking for any_errors_fatal
34886 1727204519.70515: checking for max_fail_percentage
34886 1727204519.70516: done checking for max_fail_percentage
34886 1727204519.70517: checking to see if all hosts have failed and the running result is not ok
34886 1727204519.70518: done checking to see if all hosts have failed
34886 1727204519.70519: getting the remaining hosts for this loop
34886 1727204519.70520: done getting the remaining hosts for this loop
34886 1727204519.70524: getting the next task for host managed-node3
34886 1727204519.70529: done getting next task for host managed-node3
34886 1727204519.70531: ^ task is: TASK: meta (flush_handlers)
34886 1727204519.70532: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34886 1727204519.70536: getting variables
34886 1727204519.70537: in VariableManager get_vars()
34886 1727204519.70554: Calling all_inventory to load vars for managed-node3
34886 1727204519.70557: Calling groups_inventory to load vars for managed-node3
34886 1727204519.70560: Calling all_plugins_inventory to load vars for managed-node3
34886 1727204519.70567: Calling all_plugins_play to load vars for managed-node3
34886 1727204519.70570: Calling groups_plugins_inventory to load vars for managed-node3
34886 1727204519.70574: Calling groups_plugins_play to load vars for managed-node3
34886 1727204519.72545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34886 1727204519.75532: done with get_vars()
34886 1727204519.75571: done getting variables
34886 1727204519.75643: in VariableManager get_vars()
34886 1727204519.75664: Calling all_inventory to load vars for managed-node3
34886 1727204519.75667: Calling groups_inventory to load vars for managed-node3
34886 1727204519.75670: Calling all_plugins_inventory to load vars for managed-node3
34886 1727204519.75676: Calling all_plugins_play to load vars for managed-node3
34886 1727204519.75679: Calling groups_plugins_inventory to load vars for managed-node3
34886 1727204519.75683: Calling groups_plugins_play to load vars for managed-node3
34886 1727204519.77759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34886 1727204519.80799: done with get_vars()
34886 1727204519.80848: done queuing things up, now waiting for results queue to drain
34886 1727204519.80851: results queue empty
34886 1727204519.80852: checking for any_errors_fatal
34886 1727204519.80854: done checking for any_errors_fatal
34886 1727204519.80855: checking for max_fail_percentage
34886 1727204519.80856: done checking for max_fail_percentage
34886 1727204519.80857: checking to see if all hosts have failed and the running result is not ok
34886 1727204519.80858: done checking to see if all hosts have failed
34886 1727204519.80859: getting the remaining hosts for this loop
34886 1727204519.80861: done getting the remaining hosts for this loop
34886 1727204519.80876: getting the next task for host managed-node3
34886 1727204519.80881: done getting next task for host managed-node3
34886 1727204519.80883: ^ task is: None
34886 1727204519.80885: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34886 1727204519.80886: done queuing things up, now waiting for results queue to drain
34886 1727204519.80888: results queue empty
34886 1727204519.80891: checking for any_errors_fatal
34886 1727204519.80892: done checking for any_errors_fatal
34886 1727204519.80893: checking for max_fail_percentage
34886 1727204519.80894: done checking for max_fail_percentage
34886 1727204519.80895: checking to see if all hosts have failed and the running result is not ok
34886 1727204519.80896: done checking to see if all hosts have failed
34886 1727204519.80898: getting the next task for host managed-node3
34886 1727204519.80902: done getting next task for host managed-node3
34886 1727204519.80903: ^ task is: None
34886 1727204519.80905: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node3 : ok=75 changed=2 unreachable=0 failed=0 skipped=63 rescued=0 ignored=0

Tuesday 24 September 2024 15:01:59 -0400 (0:00:00.250) 0:00:37.977 *****
===============================================================================
fedora.linux_system_roles.network : Configure networking connection profiles --- 2.87s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which services are running ---- 2.31s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.15s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 2.13s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Install iproute --------------------------------------------------------- 1.86s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Ensure ping6 command is present ----------------------------------------- 1.79s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81
Gathering Facts --------------------------------------------------------- 1.50s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.50s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.22s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3
Create veth interface veth0 --------------------------------------------- 1.19s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.00s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.90s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather the minimum subset of ansible_facts required by the network role test --- 0.85s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.83s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.69s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.67s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather current interface info ------------------------------------------- 0.66s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.61s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.53s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gather current interface info ------------------------------------------- 0.53s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
34886 1727204519.81058: RUNNING CLEANUP
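The recap shows the run finished cleanly on managed-node3 (75 tasks ok, 2 changed, 63 skipped, none failed or unreachable) in roughly 38 seconds, and the name/duration/path pairs above it appear to come from a task-profiling callback such as ansible.posix.profile_tasks. If you need to post-process a log like this one, a small script along the following lines can pull out the slowest tasks; it is a hypothetical helper (the slowest_tasks function and its regex are not part of the test suite), assuming the log is saved to a file:

    # Hypothetical helper for post-processing a saved copy of a log like the one
    # above: extract "task name ----- 1.23s" lines from the profiling summary
    # and report the slowest tasks first.
    import re
    import sys

    TIMING = re.compile(r"^(?P<name>.+?) -{3,} (?P<secs>\d+\.\d+)s$")

    def slowest_tasks(log_path, top=10):
        """Return up to `top` (seconds, task name) pairs, slowest first."""
        tasks = []
        with open(log_path, encoding="utf-8") as fh:
            for line in fh:
                match = TIMING.match(line.rstrip())
                if match:
                    tasks.append((float(match.group("secs")), match.group("name")))
        return sorted(tasks, reverse=True)[:top]

    if __name__ == "__main__":
        for secs, name in slowest_tasks(sys.argv[1]):
            print(f"{secs:6.2f}s  {name}")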