[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
25039 1726867441.84768: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Isn
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
25039 1726867441.85841: Added group all to inventory
25039 1726867441.85843: Added group ungrouped to inventory
25039 1726867441.85847: Group all now contains ungrouped
25039 1726867441.85850: Examining possible inventory source: /tmp/network-5rw/inventory.yml
25039 1726867442.16797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
25039 1726867442.16858: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
25039 1726867442.17084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
25039 1726867442.17143: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
25039 1726867442.17220: Loaded config def from plugin (inventory/script)
25039 1726867442.17222: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
25039 1726867442.17263: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
25039 1726867442.17553: Loaded config def from plugin (inventory/yaml)
25039 1726867442.17555: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
25039 1726867442.17643: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
25039 1726867442.18688: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
25039 1726867442.18692: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
25039 1726867442.18695: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
25039 1726867442.18701: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
25039 1726867442.18706: Loading data from /tmp/network-5rw/inventory.yml
25039 1726867442.18775: /tmp/network-5rw/inventory.yml was not parsable by auto
25039 1726867442.18842: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
25039 1726867442.19086: Loading data from /tmp/network-5rw/inventory.yml
25039 1726867442.19172: group all already in inventory
25039 1726867442.19182: set inventory_file for managed_node1
25039 1726867442.19186: set inventory_dir for managed_node1
25039 1726867442.19187: Added host managed_node1 to inventory
25039 1726867442.19190: Added host managed_node1 to group all
25039 1726867442.19191: set ansible_host for managed_node1
25039 1726867442.19192: set ansible_ssh_extra_args for managed_node1
25039 1726867442.19195: set inventory_file for managed_node2
25039 1726867442.19198: set inventory_dir for managed_node2
25039 1726867442.19199: Added host managed_node2 to inventory
25039 1726867442.19201: Added host managed_node2 to group all
25039 1726867442.19202: set ansible_host for managed_node2
25039 1726867442.19202: set ansible_ssh_extra_args for managed_node2
25039 1726867442.19205: set inventory_file for managed_node3
25039 1726867442.19207: set inventory_dir for managed_node3
25039 1726867442.19208: Added host managed_node3 to inventory
25039 1726867442.19209: Added host managed_node3 to group all
25039 1726867442.19210: set ansible_host for managed_node3
25039 1726867442.19211: set ansible_ssh_extra_args for managed_node3
25039 1726867442.19214: Reconcile groups and hosts in inventory.
25039 1726867442.19217: Group ungrouped now contains managed_node1
25039 1726867442.19219: Group ungrouped now contains managed_node2
25039 1726867442.19221: Group ungrouped now contains managed_node3
25039 1726867442.19504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
25039 1726867442.19630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
25039 1726867442.19830: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
25039 1726867442.19858: Loaded config def from plugin (vars/host_group_vars)
25039 1726867442.19860: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
25039 1726867442.19867: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
25039 1726867442.19874: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
25039 1726867442.19916: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
25039 1726867442.20751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867442.20848: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
25039 1726867442.21095: Loaded config def from plugin (connection/local)
25039 1726867442.21098: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
25039 1726867442.22369: Loaded config def from plugin (connection/paramiko_ssh)
25039 1726867442.22372: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
25039 1726867442.23956: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
25039 1726867442.24032: Loaded config def from plugin (connection/psrp)
25039 1726867442.24036: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
25039 1726867442.24736: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
25039 1726867442.24774: Loaded config def from plugin (connection/ssh)
25039 1726867442.24776: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
25039 1726867442.27383: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
25039 1726867442.27421: Loaded config def from plugin (connection/winrm)
25039 1726867442.27424: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
25039 1726867442.27454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
25039 1726867442.27536: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
25039 1726867442.27605: Loaded config def from plugin (shell/cmd)
25039 1726867442.27610: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
25039 1726867442.27656: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
25039 1726867442.27725: Loaded config def from plugin (shell/powershell)
25039 1726867442.27727: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
25039 1726867442.27783: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
25039 1726867442.27959: Loaded config def from plugin (shell/sh)
25039 1726867442.27962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
25039 1726867442.27996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
25039 1726867442.28119: Loaded config def from plugin (become/runas)
25039 1726867442.28122: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
25039 1726867442.28293: Loaded config def from plugin (become/su)
25039 1726867442.28296: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
25039 1726867442.28446: Loaded config def from plugin (become/sudo)
25039 1726867442.28448: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
25039 1726867442.28482: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml
25039 1726867442.28797: in VariableManager get_vars()
25039 1726867442.28821: done with get_vars()
25039 1726867442.28949: trying /usr/local/lib/python3.12/site-packages/ansible/modules
25039 1726867442.33140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
25039 1726867442.33255: in VariableManager get_vars()
25039 1726867442.33260: done with get_vars()
25039 1726867442.33263: variable 'playbook_dir' from source: magic vars
25039 1726867442.33264: variable 'ansible_playbook_python' from source: magic vars
25039 1726867442.33265: variable 'ansible_config_file' from source: magic vars
25039 1726867442.33266: variable 'groups' from source: magic vars
25039 1726867442.33267: variable 'omit' from source: magic vars
25039 1726867442.33267: variable 'ansible_version' from source: magic vars
25039 1726867442.33268: variable 'ansible_check_mode' from source: magic vars
25039 1726867442.33269: variable 'ansible_diff_mode' from source: magic vars
25039 1726867442.33270: variable 'ansible_forks' from source: magic vars
25039 1726867442.33270: variable 'ansible_inventory_sources' from source: magic vars
25039 1726867442.33271: variable 'ansible_skip_tags' from source: magic vars
25039 1726867442.33272: variable 'ansible_limit' from source: magic vars
25039 1726867442.33273: variable 'ansible_run_tags' from source: magic vars
25039 1726867442.33273: variable 'ansible_verbosity' from source: magic vars
25039 1726867442.33312: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml
25039 1726867442.33887: in VariableManager get_vars()
25039 1726867442.33903: done with get_vars()
25039 1726867442.33944: in VariableManager get_vars()
25039 1726867442.33982: done with get_vars()
25039 1726867442.34263: in VariableManager get_vars()
25039 1726867442.34276: done with get_vars()
25039 1726867442.34283: variable 'omit' from source: magic vars
25039 1726867442.34301: variable 'omit' from source: magic vars
25039 1726867442.34336: in VariableManager get_vars()
25039 1726867442.34347: done with get_vars()
25039 1726867442.34395: in VariableManager get_vars()
25039 1726867442.34410: done with get_vars()
25039 1726867442.34447: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
25039 1726867442.34657: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
25039 1726867442.34786: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
25039 1726867442.35435: in VariableManager get_vars()
25039 1726867442.35452: done with get_vars()
25039 1726867442.35858: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
25039 1726867442.36006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
25039 1726867442.37870: in VariableManager get_vars()
25039 1726867442.37890: done with get_vars()
25039 1726867442.37930: in VariableManager get_vars()
25039 1726867442.37960: done with get_vars()
25039 1726867442.38800: in VariableManager get_vars()
25039 1726867442.38819: done with get_vars()
25039 1726867442.38824: variable 'omit' from source: magic vars
25039 1726867442.38835: variable 'omit' from source: magic vars
25039 1726867442.38871: in VariableManager get_vars()
25039 1726867442.38887: done with get_vars()
25039 1726867442.38908: in VariableManager get_vars()
25039 1726867442.38925: done with get_vars()
25039 1726867442.38954: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
25039 1726867442.39068: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
25039 1726867442.39235: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
25039 1726867442.41251: in VariableManager get_vars()
25039 1726867442.41272: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
25039 1726867442.44290: in VariableManager get_vars()
25039 1726867442.44310: done with get_vars()
25039 1726867442.44438: in VariableManager get_vars()
25039 1726867442.44457: done with get_vars()
25039 1726867442.44714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
25039 1726867442.44728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
25039 1726867442.45153: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
25039 1726867442.45518: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
25039 1726867442.45521: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
25039 1726867442.45552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
25039 1726867442.45576: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
25039 1726867442.45844: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
25039 1726867442.45904: Loaded config def from plugin (callback/default)
25039 1726867442.45907: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
25039 1726867442.47110: Loaded config def from plugin (callback/junit)
25039 1726867442.47112: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
25039 1726867442.47156: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
25039 1726867442.47297: Loaded config def from plugin (callback/minimal)
25039 1726867442.47299: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
25039 1726867442.47337: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
25039 1726867442.47396: Loaded config def from plugin (callback/tree)
25039 1726867442.47399: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
25039 1726867442.47516: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
25039 1726867442.47518: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ipv6_nm.yml ****************************************************
2 plays in /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml
25039 1726867442.47544: in VariableManager get_vars()
25039 1726867442.47556: done with get_vars()
25039 1726867442.47561: in VariableManager get_vars()
25039 1726867442.47568: done with get_vars()
25039 1726867442.47572: variable 'omit' from source: magic vars
25039 1726867442.47631: in VariableManager get_vars()
25039 1726867442.47645: done with get_vars()
25039 1726867442.47666: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ipv6.yml' with nm as provider] *************
25039 1726867442.48188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
25039 1726867442.48257: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
25039 1726867442.48288: getting the remaining hosts for this loop
25039 1726867442.48290: done getting the remaining hosts for this loop
25039 1726867442.48292: getting the next task for host managed_node1
25039 1726867442.48296: done getting next task for host managed_node1
25039 1726867442.48298: ^ task is: TASK: Gathering Facts
25039 1726867442.48299: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867442.48302: getting variables
25039 1726867442.48303: in VariableManager get_vars()
25039 1726867442.48312: Calling all_inventory to load vars for managed_node1
25039 1726867442.48314: Calling groups_inventory to load vars for managed_node1
25039 1726867442.48317: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867442.48328: Calling all_plugins_play to load vars for managed_node1
25039 1726867442.48338: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867442.48342: Calling groups_plugins_play to load vars for managed_node1
25039 1726867442.48373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867442.48426: done with get_vars()
25039 1726867442.48433: done getting variables
25039 1726867442.48495: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
Friday 20 September 2024 17:24:02 -0400 (0:00:00.010) 0:00:00.010 ******
25039 1726867442.48516: entering _queue_task() for managed_node1/gather_facts
25039 1726867442.48517: Creating lock for gather_facts
25039 1726867442.48858: worker is 1 (out of 1 available)
25039 1726867442.48867: exiting _queue_task() for managed_node1/gather_facts
25039 1726867442.48885: done queuing things up, now waiting for results queue to drain
25039 1726867442.48887: waiting for pending results...
25039 1726867442.49219: running TaskExecutor() for managed_node1/TASK: Gathering Facts
25039 1726867442.49225: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000b9
25039 1726867442.49228: variable 'ansible_search_path' from source: unknown
25039 1726867442.49265: calling self._execute()
25039 1726867442.49338: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867442.49348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867442.49362: variable 'omit' from source: magic vars
25039 1726867442.49651: variable 'omit' from source: magic vars
25039 1726867442.49685: variable 'omit' from source: magic vars
25039 1726867442.49728: variable 'omit' from source: magic vars
25039 1726867442.49785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
25039 1726867442.49828: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
25039 1726867442.49859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
25039 1726867442.49884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
25039 1726867442.49901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
25039 1726867442.49933: variable 'inventory_hostname' from source: host vars for 'managed_node1'
25039 1726867442.49955: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867442.49958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867442.50051: Set connection var ansible_shell_executable to /bin/sh
25039 1726867442.50283: Set connection var ansible_timeout to 10
25039 1726867442.50286: Set connection var ansible_shell_type to sh
25039 1726867442.50288: Set connection var ansible_connection to ssh
25039 1726867442.50290: Set connection var ansible_module_compression to ZIP_DEFLATED
25039 1726867442.50292: Set connection var ansible_pipelining to False
25039 1726867442.50294: variable 'ansible_shell_executable' from source: unknown
25039 1726867442.50296: variable 'ansible_connection' from source: unknown
25039 1726867442.50298: variable 'ansible_module_compression' from source: unknown
25039 1726867442.50300: variable 'ansible_shell_type' from source: unknown
25039 1726867442.50302: variable 'ansible_shell_executable' from source: unknown
25039 1726867442.50304: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867442.50306: variable 'ansible_pipelining' from source: unknown
25039 1726867442.50307: variable 'ansible_timeout' from source: unknown
25039 1726867442.50309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867442.50385: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
25039 1726867442.50401: variable 'omit' from source: magic vars
25039 1726867442.50410: starting attempt loop
25039 1726867442.50423: running the handler
25039 1726867442.50447: variable 'ansible_facts' from source: unknown
25039 1726867442.50468: _low_level_execute_command(): starting
25039 1726867442.50482: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
25039 1726867442.51296: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<<
25039 1726867442.51322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
25039 1726867442.51339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
25039 1726867442.51530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
25039 1726867442.53118: stdout chunk (state=3): >>>/root <<<
25039 1726867442.53345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
25039 1726867442.53348: stdout chunk (state=3): >>><<<
25039 1726867442.53351: stderr chunk (state=3): >>><<<
25039 1726867442.53396: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
25039 1726867442.53419: _low_level_execute_command(): starting
25039 1726867442.53559: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331 `" && echo ansible-tmp-1726867442.5340288-25077-279723507657331="` echo /root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331 `" ) && sleep 0'
25039 1726867442.54769: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<<
25039 1726867442.54882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
25039 1726867442.54962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
25039 1726867442.56903: stdout chunk (state=3): >>>ansible-tmp-1726867442.5340288-25077-279723507657331=/root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331 <<<
25039 1726867442.57081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
25039 1726867442.57085: stdout chunk (state=3): >>><<<
25039 1726867442.57087: stderr chunk (state=3): >>><<<
25039 1726867442.57106: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867442.5340288-25077-279723507657331=/root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
25039 1726867442.57147: variable 'ansible_module_compression' from source: unknown
25039 1726867442.57382: ANSIBALLZ: Using generic lock for ansible.legacy.setup
25039 1726867442.57385: ANSIBALLZ: Acquiring lock
25039 1726867442.57388: ANSIBALLZ: Lock acquired: 140682442827552
25039 1726867442.57390: ANSIBALLZ: Creating module
25039 1726867443.02730: ANSIBALLZ: Writing module into payload
25039 1726867443.02896: ANSIBALLZ: Writing module
25039 1726867443.02925: ANSIBALLZ: Renaming module
25039 1726867443.02943: ANSIBALLZ: Done creating module
25039 1726867443.02985: variable 'ansible_facts' from source: unknown
25039 1726867443.02998: variable 'inventory_hostname' from source: host vars for 'managed_node1'
25039 1726867443.03017: _low_level_execute_command(): starting
25039 1726867443.03045: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
25039 1726867443.03857: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
25039 1726867443.03861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<<
25039 1726867443.03874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
25039 1726867443.03893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
25039 1726867443.03990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
25039 1726867443.05691: stdout chunk (state=3): >>>PLATFORM <<<
25039 1726867443.05783: stdout chunk (state=3): >>>Linux <<<
25039 1726867443.05794: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<<
25039 1726867443.06023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
25039 1726867443.06026: stdout chunk (state=3): >>><<<
25039 1726867443.06029: stderr chunk (state=3): >>><<<
25039 1726867443.06286: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867443.06290 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 25039 1726867443.06293: _low_level_execute_command(): starting 25039 1726867443.06295: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 25039 1726867443.06848: Sending initial data 25039 1726867443.06851: Sent initial data (1181 bytes) 25039 1726867443.07892: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867443.07912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867443.08120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867443.08287: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867443.11799: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 25039 1726867443.12133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867443.12137: stdout chunk (state=3): >>><<< 25039 1726867443.12143: stderr chunk (state=3): >>><<< 25039 1726867443.12157: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867443.12586: variable 'ansible_facts' from source: unknown 25039 1726867443.12590: variable 'ansible_facts' from source: unknown 25039 1726867443.12592: variable 'ansible_module_compression' from source: unknown 25039 1726867443.12594: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25039 1726867443.12597: variable 'ansible_facts' from source: unknown 25039 1726867443.12851: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331/AnsiballZ_setup.py 25039 1726867443.13326: Sending initial data 25039 1726867443.13330: Sent initial data (154 bytes) 25039 1726867443.14594: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867443.14709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867443.14918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867443.14962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25039 1726867443.17148: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867443.17197: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867443.17289: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpointbt_t /root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331/AnsiballZ_setup.py <<< 25039 1726867443.17331: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpointbt_t" to remote "/root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331/AnsiballZ_setup.py" <<< 25039 1726867443.20086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867443.20105: stdout chunk (state=3): >>><<< 25039 1726867443.20123: stderr chunk (state=3): >>><<< 25039 1726867443.20228: done transferring module to remote 25039 1726867443.20439: _low_level_execute_command(): starting 25039 1726867443.20442: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331/ /root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331/AnsiballZ_setup.py && sleep 0' 25039 1726867443.21730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867443.21751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867443.22001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867443.22026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867443.22044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867443.22127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25039 1726867443.24788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867443.24798: stdout chunk (state=3): >>><<< 25039 1726867443.24902: stderr chunk (state=3): >>><<< 25039 1726867443.24929: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25039 1726867443.24938: _low_level_execute_command(): starting 25039 1726867443.24948: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331/AnsiballZ_setup.py && sleep 0' 25039 1726867443.25894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867443.25920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867443.26128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867443.26141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867443.26361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867443.26458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 
1726867443.26562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867443.29257: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 25039 1726867443.29361: stdout chunk (state=3): >>>import _imp # builtin <<< 25039 1726867443.29385: stdout chunk (state=3): >>>import '_thread' # <<< 25039 1726867443.29396: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 25039 1726867443.29596: stdout chunk (state=3): >>>import '_io' # <<< 25039 1726867443.29600: stdout chunk (state=3): >>>import 'marshal' # <<< 25039 1726867443.29624: stdout chunk (state=3): >>>import 'posix' # <<< 25039 1726867443.29768: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 25039 1726867443.29800: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 25039 1726867443.29834: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dc184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dbe7b30> <<< 25039 1726867443.29865: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dc1aa50> <<< 25039 1726867443.29967: stdout chunk 
(state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 25039 1726867443.29993: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 25039 1726867443.30214: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 25039 1726867443.30241: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da2d130> <<< 25039 1726867443.30705: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 25039 1726867443.31214: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da6bdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da6bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 25039 1726867443.31320: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1daa37a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches 
/usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1daa3e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da83aa0> <<< 25039 1726867443.31343: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da811c0> <<< 25039 1726867443.31402: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da68f80> <<< 25039 1726867443.31441: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 25039 1726867443.31464: stdout chunk (state=3): >>>import '_sre' # <<< 25039 1726867443.31583: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dac3710> <<< 25039 1726867443.31623: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dac2330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da82090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f1b1dac0b90> <<< 25039 1726867443.31757: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 25039 1726867443.31772: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1daf8740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da68200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1daf8bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1daf8aa0> <<< 25039 1726867443.31801: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1daf8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da66d20> <<< 25039 1726867443.31825: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 25039 1726867443.31856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 25039 
1726867443.31894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 25039 1726867443.32061: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1daf9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1daf9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dafa480> import 'importlib.util' # import 'runpy' # <<< 25039 1726867443.32065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 25039 1726867443.32069: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 25039 1726867443.32120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1db10680> import 'errno' # <<< 25039 1726867443.32173: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1db11d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 25039 1726867443.32261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 25039 1726867443.32264: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1db12c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1db13260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1db12150> <<< 25039 1726867443.32267: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 25039 1726867443.32336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 25039 1726867443.32339: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867443.32345: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1db13ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1db13410> <<< 25039 1726867443.32411: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dafa4b0> <<< 25039 1726867443.32415: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 25039 1726867443.32455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 25039 1726867443.32572: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 25039 1726867443.32605: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d823bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d84c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d84c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d84c6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 25039 1726867443.32681: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867443.32801: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' 
import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d84d010> <<< 25039 1726867443.32929: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867443.32949: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d84da00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d84c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d821d60> <<< 25039 1726867443.32984: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 25039 1726867443.33011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 25039 1726867443.33035: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 25039 1726867443.33051: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d84ed80> <<< 25039 1726867443.33312: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d84d880> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dafaba0> <<< 25039 1726867443.33316: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 25039 1726867443.33333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d877110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 25039 1726867443.33357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 25039 1726867443.33393: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d89b440> <<< 25039 1726867443.33432: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 25039 1726867443.33470: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 25039 1726867443.33800: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d8fc1a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f1b1d8fe900> <<< 25039 1726867443.33897: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d8fc2c0> <<< 25039 1726867443.33965: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d8c9220> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 25039 1726867443.34074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d709250> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d89a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d84fce0> <<< 25039 1726867443.34286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 25039 1726867443.34309: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f1b1d7094f0> <<< 25039 1726867443.34685: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_qexumud8/ansible_ansible.legacy.setup_payload.zip' <<< 25039 1726867443.34716: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.35032: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 25039 1726867443.35054: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 25039 1726867443.35149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d76af90> <<< 25039 1726867443.35166: stdout chunk (state=3): >>>import '_typing' # <<< 25039 1726867443.35433: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d749e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d749010> <<< 25039 1726867443.35500: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 25039 1726867443.35587: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.35598: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 25039 1726867443.37265: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.38600: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d768ce0> <<< 25039 1726867443.38609: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 25039 1726867443.38710: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # 
code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d79e810> <<< 25039 1726867443.38756: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d79e5a0> <<< 25039 1726867443.38933: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d79deb0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d79e330> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d76bc20> import 'atexit' # <<< 25039 1726867443.38937: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d79f590> <<< 25039 1726867443.38989: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d79f7d0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 25039 1726867443.39075: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 25039 1726867443.39108: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d79fd10> <<< 25039 1726867443.39264: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d129ac0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d12b6b0> <<< 25039 1726867443.39292: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 25039 1726867443.39314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 25039 1726867443.39362: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d12bf50> <<< 25039 1726867443.39467: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d12d1f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 25039 1726867443.39529: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 25039 1726867443.39596: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d12fce0> <<< 25039 1726867443.39675: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d74b080> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d12dfa0> <<< 25039 1726867443.39690: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 25039 1726867443.39712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 25039 1726867443.39790: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 25039 1726867443.39793: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 25039 1726867443.40012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d137bf0> import '_tokenize' # <<< 25039 1726867443.40094: stdout chunk (state=3): >>>import 'tokenize' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d1366c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d136450> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 25039 1726867443.40219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 25039 1726867443.40308: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d136990> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d12e4b0> <<< 25039 1726867443.40527: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d17bec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d17b890> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 25039 1726867443.40569: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 25039 1726867443.40623: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 25039 1726867443.40642: stdout chunk (state=3): >>> import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d17d9d0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d17d7c0><<< 25039 1726867443.40687: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 25039 1726867443.40758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 25039 1726867443.40784: stdout chunk (state=3): >>> # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867443.40813: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d17fec0> <<< 25039 1726867443.40859: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d17e030> <<< 25039 1726867443.40970: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 25039 1726867443.41102: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 25039 1726867443.41105: stdout chunk (state=3): >>> <<< 25039 1726867443.41152: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f1b1d183500> <<< 25039 1726867443.41717: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d17ff80> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d1843e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d184620> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d184920> <<< 25039 1726867443.41720: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d17c110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 25039 1726867443.41744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 25039 1726867443.41786: stdout chunk (state=3): >>> # extension module '_socket' loaded from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 25039 1726867443.41829: stdout chunk (state=3): >>> # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 25039 1726867443.41844: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d187fe0><<< 25039 1726867443.41953: stdout chunk (state=3): >>> <<< 25039 1726867443.42091: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 25039 1726867443.42143: stdout chunk (state=3): >>> # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867443.42194: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d00d0d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d186780> <<< 25039 1726867443.42223: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 25039 1726867443.42363: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d187b30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d1863f0> <<< 25039 1726867443.42386: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 25039 1726867443.42496: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.42648: stdout chunk (state=3): >>> # zipimport: zlib available<<< 25039 1726867443.42855: stdout chunk 
(state=3): >>> # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 25039 1726867443.42993: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.43105: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.43120: stdout chunk (state=3): >>> <<< 25039 1726867443.44036: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.44058: stdout chunk (state=3): >>> <<< 25039 1726867443.44953: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 25039 1726867443.44997: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 25039 1726867443.45097: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 25039 1726867443.45299: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d0153d0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 25039 1726867443.45312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 25039 1726867443.45332: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d016150><<< 25039 1726867443.45359: stdout chunk (state=3): >>> import 'ctypes' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d00d4f0><<< 25039 1726867443.45402: stdout chunk (state=3): >>> <<< 25039 1726867443.45538: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 25039 1726867443.45541: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 25039 1726867443.45563: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.45757: stdout chunk (state=3): >>> <<< 25039 1726867443.45813: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.46079: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 25039 1726867443.46095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 25039 1726867443.46154: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d0167b0> <<< 25039 1726867443.46190: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.46874: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.47662: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.47665: stdout chunk (state=3): >>> <<< 25039 1726867443.47751: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.47774: stdout chunk (state=3): >>> <<< 25039 1726867443.47926: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 25039 1726867443.47931: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.47984: stdout chunk (state=3): >>> <<< 25039 1726867443.47987: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.47989: stdout chunk (state=3): >>> <<< 25039 1726867443.48083: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available<<< 25039 
1726867443.48087: stdout chunk (state=3): >>> <<< 25039 1726867443.48180: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.48407: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 25039 1726867443.48494: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 25039 1726867443.48525: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.48901: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.48927: stdout chunk (state=3): >>> <<< 25039 1726867443.49346: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 25039 1726867443.49384: stdout chunk (state=3): >>>import '_ast' # <<< 25039 1726867443.49401: stdout chunk (state=3): >>> <<< 25039 1726867443.49493: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d017320> <<< 25039 1726867443.49515: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.49632: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.49755: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 25039 1726867443.49762: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 25039 1726867443.49795: stdout chunk (state=3): >>> import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 25039 1726867443.49887: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 25039 1726867443.49898: stdout chunk (state=3): >>> <<< 25039 1726867443.49952: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # 
<<< 25039 1726867443.49982: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.50055: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.50114: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.50160: stdout chunk (state=3): >>> <<< 25039 1726867443.50220: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.50334: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 25039 1726867443.50553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 25039 1726867443.50557: stdout chunk (state=3): >>> <<< 25039 1726867443.50581: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d021c40> <<< 25039 1726867443.50626: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d01edb0> <<< 25039 1726867443.50685: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 25039 1726867443.50756: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 25039 1726867443.50789: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.50815: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.51064: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # 
code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 25039 1726867443.51098: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 25039 1726867443.51132: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py<<< 25039 1726867443.51158: stdout chunk (state=3): >>> <<< 25039 1726867443.51224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 25039 1726867443.51288: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 25039 1726867443.51390: stdout chunk (state=3): >>> import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d10a600> <<< 25039 1726867443.51461: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d1fe2d0> <<< 25039 1726867443.51597: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d021d00> <<< 25039 1726867443.51685: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d017d40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 25039 1726867443.51761: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 25039 1726867443.51871: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 25039 1726867443.51929: stdout chunk 
(state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 25039 1726867443.51951: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867443.52104: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 25039 1726867443.52143: stdout chunk (state=3): >>> <<< 25039 1726867443.52173: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.52210: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.52248: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.52269: stdout chunk (state=3): >>> <<< 25039 1726867443.52318: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.52329: stdout chunk (state=3): >>> <<< 25039 1726867443.52390: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.52428: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 25039 1726867443.52457: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867443.52654: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.52906: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 25039 1726867443.53090: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.53369: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.53432: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.53459: stdout chunk (state=3): >>> <<< 25039 1726867443.53516: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 25039 1726867443.53537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc'<<< 25039 1726867443.53578: stdout chunk (state=3): >>> 
# /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 25039 1726867443.53631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 25039 1726867443.53679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc'<<< 25039 1726867443.53708: stdout chunk (state=3): >>> import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d0b5b20> <<< 25039 1726867443.53747: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 25039 1726867443.53801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 25039 1726867443.53868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc'<<< 25039 1726867443.53905: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 25039 1726867443.53950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc'<<< 25039 1726867443.54054: stdout chunk (state=3): >>> import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cc5bcb0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867443.54361: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1cc5bfe0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d09ed80> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d0b6690> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d0b4200> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d0b7e00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc'<<< 25039 1726867443.54364: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 25039 1726867443.54366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc'<<< 25039 1726867443.54397: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py<<< 25039 1726867443.54426: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc'<<< 25039 1726867443.54451: stdout chunk (state=3): >>> <<< 25039 1726867443.54483: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867443.54561: stdout chunk (state=3): >>>import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1cc72f30> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cc727e0> # 
extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1cc72990><<< 25039 1726867443.54589: stdout chunk (state=3): >>> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cc71c40><<< 25039 1726867443.54606: stdout chunk (state=3): >>> <<< 25039 1726867443.54656: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py<<< 25039 1726867443.54762: stdout chunk (state=3): >>> <<< 25039 1726867443.54926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cc73020> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 25039 1726867443.54993: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867443.55005: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1ccc9b50> <<< 25039 1726867443.55048: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cc73b30> <<< 25039 1726867443.55107: stdout chunk (state=3): >>>import 'multiprocessing.pool' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d0b7ef0> import 'ansible.module_utils.facts.timeout' # <<< 25039 1726867443.55140: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.collector' # <<< 25039 1726867443.55170: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.55206: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867443.55234: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 25039 1726867443.55318: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.55411: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.other.facter' # <<< 25039 1726867443.55451: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.55454: stdout chunk (state=3): >>> <<< 25039 1726867443.55533: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.55536: stdout chunk (state=3): >>> <<< 25039 1726867443.55635: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 25039 1726867443.55735: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 25039 1726867443.55792: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 25039 1726867443.55816: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.55871: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867443.55982: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 25039 1726867443.55985: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.56051: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.56116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 25039 1726867443.56132: stdout chunk (state=3): 
>>> <<< 25039 1726867443.56222: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 25039 1726867443.56238: stdout chunk (state=3): >>> <<< 25039 1726867443.56313: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.56322: stdout chunk (state=3): >>> <<< 25039 1726867443.56498: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.utils' # <<< 25039 1726867443.56517: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 25039 1726867443.56542: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.56546: stdout chunk (state=3): >>> <<< 25039 1726867443.57313: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.57463: stdout chunk (state=3): >>> <<< 25039 1726867443.58133: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 25039 1726867443.58136: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25039 1726867443.58248: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.58267: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.58284: stdout chunk (state=3): >>> <<< 25039 1726867443.58357: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 25039 1726867443.58379: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.58449: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.58463: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 25039 1726867443.58542: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.58576: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867443.58668: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 25039 1726867443.58689: stdout 
chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.58763: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867443.58799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 25039 1726867443.58822: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.58910: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 25039 1726867443.58933: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867443.59063: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.59289: stdout chunk (state=3): >>> <<< 25039 1726867443.59298: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cccb830> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 25039 1726867443.59343: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 25039 1726867443.59537: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1ccca690> <<< 25039 1726867443.59556: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 25039 1726867443.59698: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25039 1726867443.59793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 25039 1726867443.59847: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.59850: stdout chunk (state=3): >>> <<< 25039 1726867443.59956: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.59998: stdout chunk (state=3): >>> <<< 25039 
1726867443.60111: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 25039 1726867443.60189: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.60221: stdout chunk (state=3): >>> # zipimport: zlib available<<< 25039 1726867443.60269: stdout chunk (state=3): >>> <<< 25039 1726867443.60337: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 25039 1726867443.60362: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.60370: stdout chunk (state=3): >>> <<< 25039 1726867443.60429: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.60509: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 25039 1726867443.60587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 25039 1726867443.60697: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867443.60798: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'<<< 25039 1726867443.60854: stdout chunk (state=3): >>> import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1cd09e20> <<< 25039 1726867443.61130: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cce8170> <<< 25039 1726867443.61151: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 25039 1726867443.61182: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.61260: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.61343: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.selinux' # <<< 25039 1726867443.61384: stdout chunk (state=3): >>># zipimport: 
zlib available<<< 25039 1726867443.61403: stdout chunk (state=3): >>> <<< 25039 1726867443.61519: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.61646: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.61829: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.62085: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 25039 1726867443.62089: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.service_mgr' # <<< 25039 1726867443.62106: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867443.62159: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.62260: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 25039 1726867443.62482: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 25039 1726867443.62544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1cd118b0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cd09c10> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 25039 1726867443.62566: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 25039 1726867443.62587: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867443.62739: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 25039 1726867443.63075: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.63210: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 25039 1726867443.63250: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867443.63444: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.63565: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867443.63642: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.63751: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 25039 1726867443.63821: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25039 1726867443.64036: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.64063: stdout chunk (state=3): >>> <<< 25039 1726867443.64246: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 25039 1726867443.64275: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 25039 1726867443.64300: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.64485: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.64563: stdout chunk (state=3): >>> <<< 25039 1726867443.64701: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 25039 1726867443.64725: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.64786: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.64851: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.64865: stdout chunk (state=3): >>> <<< 25039 1726867443.65794: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.65958: stdout 
chunk (state=3): >>> <<< 25039 1726867443.66592: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 25039 1726867443.66598: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 25039 1726867443.66600: stdout chunk (state=3): >>> <<< 25039 1726867443.66624: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.66795: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.66974: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 25039 1726867443.66982: stdout chunk (state=3): >>> <<< 25039 1726867443.67010: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.67154: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.67164: stdout chunk (state=3): >>> <<< 25039 1726867443.67326: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 25039 1726867443.67350: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.67593: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.67833: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 25039 1726867443.67866: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.67902: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 25039 1726867443.67928: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867443.68096: stdout chunk (state=3): >>> # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 25039 1726867443.68100: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.68238: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.68402: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.68635: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 
1726867443.68755: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 25039 1726867443.68784: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.68803: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.68840: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 25039 1726867443.68860: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.68892: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 25039 1726867443.68952: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.68975: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.69053: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 25039 1726867443.69119: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.69147: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 25039 1726867443.69227: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.69231: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 25039 1726867443.69258: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.69315: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.69355: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 25039 1726867443.69392: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.69879: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.70154: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 25039 1726867443.70157: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 
1726867443.70240: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.70329: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 25039 1726867443.70332: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.70379: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.70432: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 25039 1726867443.70558: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 25039 1726867443.70606: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 25039 1726867443.70751: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.70899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 25039 1726867443.70929: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.70991: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 25039 1726867443.70994: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.71020: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.71044: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.71129: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.71200: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.71301: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.71397: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 
'ansible.module_utils.facts.virtual.dragonfly' # <<< 25039 1726867443.71453: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.71481: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.71548: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 25039 1726867443.71560: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.71871: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.72137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 25039 1726867443.72141: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.72347: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 25039 1726867443.72351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 25039 1726867443.72368: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.72460: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.72701: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 25039 1726867443.72886: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867443.72890: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 25039 1726867443.73004: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 25039 1726867443.73025: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 25039 1726867443.73048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 25039 1726867443.73062: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867443.73087: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1cb0e1e0> <<< 25039 1726867443.73140: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cb0c5f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cb07c80> <<< 25039 1726867443.85631: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cb54080> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cb54d70> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 25039 1726867443.85685: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cb570b0> <<< 25039 1726867443.85693: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cb55ca0> <<< 25039 1726867443.86035: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 25039 1726867444.11793: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.5048828125, "5m": 0.40283203125, "15m": 0.21875}, "ansible_fips": false, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": 
"pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_<<< 25039 1726867444.11884: stdout chunk (state=3): >>>processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2946, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 585, "free": 2946}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", 
"ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 688, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794164736, "block_size": 4096, "block_total": 65519099, "block_available": 63914591, "block_used": 1604508, "inode_total": 131070960, "inode_available": 131029044, "inode_used": 41916, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], 
"ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "24", "second": "04", "epoch": "1726867444", "epoch_int": "1726867444", "date": "2024-09-20", "time": "17:24:04", "iso8601_micro": "2024-09-20T21:24:04.058510Z", "iso8601": "2024-09-20T21:24:04Z", "iso8601_basic": "20240920T172404058510", "iso8601_basic_short": "20240920T172404", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f<<< 25039 1726867444.11888: stdout chunk (state=3): >>>", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25039 1726867444.12586: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 25039 1726867444.12625: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path<<< 25039 1726867444.12629: stdout chunk (state=3): >>> # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin<<< 25039 1726867444.12650: stdout chunk (state=3): >>> # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] 
removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io<<< 25039 1726867444.12686: stdout chunk (state=3): >>> # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath<<< 25039 1726867444.12719: stdout chunk (state=3): >>> # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib<<< 25039 1726867444.12751: stdout chunk (state=3): >>> # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler<<< 25039 1726867444.12762: stdout chunk (state=3): >>> # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] 
removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util<<< 25039 1726867444.12794: stdout chunk (state=3): >>> # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect<<< 25039 1726867444.12837: stdout chunk (state=3): >>> # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib<<< 25039 1726867444.12862: stdout chunk (state=3): >>> # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils<<< 25039 1726867444.12887: stdout chunk (state=3): >>> # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing 
json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd <<< 25039 1726867444.12909: stdout chunk (state=3): >>># cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd<<< 25039 1726867444.12979: stdout chunk (state=3): >>> # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec <<< 25039 1726867444.13012: stdout chunk (state=3): >>># destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale <<< 25039 1726867444.13051: stdout chunk (state=3): >>># cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file<<< 25039 1726867444.13068: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # 
cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing<<< 25039 1726867444.13093: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq <<< 25039 1726867444.13133: stdout chunk (state=3): >>># cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot <<< 25039 1726867444.13136: stdout chunk (state=3): 
>>># cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb<<< 25039 1726867444.13174: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user<<< 25039 1726867444.13213: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base<<< 25039 1726867444.13220: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme<<< 25039 1726867444.13248: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # 
cleanup[2] removing ansible.module_utils.facts.virtual.netbsd<<< 25039 1726867444.13313: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter<<< 25039 1726867444.13317: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils<<< 25039 1726867444.13341: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy 
ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd<<< 25039 1726867444.13358: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd<<< 25039 1726867444.13573: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 25039 1726867444.13925: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 25039 1726867444.13930: stdout chunk (state=3): >>> <<< 25039 1726867444.13932: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util<<< 25039 1726867444.13950: stdout chunk (state=3): >>> <<< 25039 1726867444.14008: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression <<< 25039 1726867444.14074: stdout chunk (state=3): >>># destroy _lzma # destroy _blake2 # destroy binascii<<< 25039 1726867444.14148: stdout chunk (state=3): >>> # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 25039 1726867444.14164: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress<<< 25039 1726867444.14254: stdout chunk (state=3): >>> # destroy ntpath<<< 25039 1726867444.14295: stdout chunk (state=3): >>> # destroy importlib <<< 25039 1726867444.14349: stdout chunk (state=3): >>># destroy zipimport<<< 25039 1726867444.14352: stdout chunk (state=3): >>> # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp<<< 25039 1726867444.14390: stdout chunk (state=3): >>> # destroy encodings # destroy _locale # destroy locale <<< 25039 1726867444.14396: stdout chunk (state=3): >>># destroy select # destroy _signal # destroy _posixsubprocess<<< 25039 1726867444.14417: stdout chunk (state=3): >>> # destroy syslog # destroy uuid<<< 25039 1726867444.14486: stdout chunk (state=3): 
>>> # destroy selinux<<< 25039 1726867444.14489: stdout chunk (state=3): >>> <<< 25039 1726867444.14521: stdout chunk (state=3): >>># destroy shutil # destroy distro <<< 25039 1726867444.14555: stdout chunk (state=3): >>># destroy distro.distro # destroy argparse # destroy logging <<< 25039 1726867444.14594: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 25039 1726867444.14649: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool<<< 25039 1726867444.14653: stdout chunk (state=3): >>> # destroy signal # destroy pickle<<< 25039 1726867444.14668: stdout chunk (state=3): >>> # destroy _compat_pickle <<< 25039 1726867444.14700: stdout chunk (state=3): >>># destroy _pickle # destroy queue<<< 25039 1726867444.14718: stdout chunk (state=3): >>> # destroy _heapq # destroy _queue # destroy multiprocessing.reduction<<< 25039 1726867444.14752: stdout chunk (state=3): >>> # destroy selectors # destroy shlex # destroy fcntl<<< 25039 1726867444.14781: stdout chunk (state=3): >>> # destroy datetime # destroy subprocess<<< 25039 1726867444.14803: stdout chunk (state=3): >>> # destroy base64<<< 25039 1726867444.14836: stdout chunk (state=3): >>> # destroy _ssl<<< 25039 1726867444.14845: stdout chunk (state=3): >>> <<< 25039 1726867444.14872: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd<<< 25039 1726867444.14898: stdout chunk (state=3): >>> # destroy termios # destroy json<<< 25039 1726867444.14938: stdout chunk (state=3): >>> # destroy socket<<< 25039 1726867444.14969: stdout chunk (state=3): >>> # destroy struct # destroy glob<<< 25039 1726867444.14995: stdout chunk (state=3): >>> # destroy fnmatch<<< 25039 1726867444.15006: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # 
destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context<<< 25039 1726867444.15081: stdout chunk (state=3): >>> # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 25039 1726867444.15114: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna <<< 25039 1726867444.15117: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser<<< 25039 1726867444.15153: stdout chunk (state=3): >>> # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 25039 1726867444.15184: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 25039 1726867444.15204: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 25039 1726867444.15235: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 25039 1726867444.15254: stdout chunk (state=3): >>> # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect<<< 25039 1726867444.15392: stdout chunk (state=3): >>> # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 25039 1726867444.15565: stdout chunk (state=3): >>># destroy sys.monitoring <<< 25039 1726867444.15612: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 25039 1726867444.15676: stdout chunk (state=3): >>># destroy platform # destroy _uuid<<< 25039 1726867444.15686: stdout chunk (state=3): >>> <<< 25039 1726867444.15688: stdout chunk (state=3): >>># destroy stat<<< 25039 1726867444.15691: stdout chunk (state=3): >>> <<< 25039 1726867444.15745: stdout chunk (state=3): >>># destroy genericpath # destroy 
re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg<<< 25039 1726867444.15805: stdout chunk (state=3): >>> # destroy contextlib # destroy _typing<<< 25039 1726867444.15820: stdout chunk (state=3): >>> # destroy _tokenize <<< 25039 1726867444.15879: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp<<< 25039 1726867444.15914: stdout chunk (state=3): >>> # destroy _io # destroy marshal # clear sys.meta_path<<< 25039 1726867444.15917: stdout chunk (state=3): >>> <<< 25039 1726867444.15938: stdout chunk (state=3): >>># clear sys.modules<<< 25039 1726867444.16044: stdout chunk (state=3): >>> # destroy _frozen_importlib # destroy codecs <<< 25039 1726867444.16104: stdout chunk (state=3): >>># destroy encodings.aliases # destroy encodings.utf_8 <<< 25039 1726867444.16123: stdout chunk (state=3): >>># destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs<<< 25039 1726867444.16144: stdout chunk (state=3): >>> <<< 25039 1726867444.16181: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit<<< 25039 1726867444.16225: stdout chunk (state=3): >>> # destroy _warnings # destroy math # destroy _bisect # destroy time<<< 25039 1726867444.16233: stdout chunk (state=3): >>> # destroy _random<<< 25039 1726867444.16258: stdout chunk (state=3): >>> # destroy _weakref<<< 25039 1726867444.16304: stdout chunk (state=3): >>> # destroy _hashlib<<< 25039 1726867444.16309: stdout chunk (state=3): >>> <<< 25039 
1726867444.16315: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re<<< 25039 1726867444.16365: stdout chunk (state=3): >>> # destroy itertools # destroy _abc<<< 25039 1726867444.16371: stdout chunk (state=3): >>> # destroy posix # destroy _functools # destroy builtins<<< 25039 1726867444.16390: stdout chunk (state=3): >>> # destroy _thread # clear sys.audit hooks<<< 25039 1726867444.16467: stdout chunk (state=3): >>> <<< 25039 1726867444.17064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 25039 1726867444.17067: stdout chunk (state=3): >>><<< 25039 1726867444.17069: stderr chunk (state=3): >>><<< 25039 1726867444.17307: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dc184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dbe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dc1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da6bdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da6bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1daa37a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1b1daa3e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da83aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da811c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da68f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dac3710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dac2330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da82090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dac0b90> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1daf8740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da68200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1daf8bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1daf8aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1daf8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1da66d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1daf9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1daf9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dafa480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1db10680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1db11d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1db12c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1db13260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1db12150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1db13ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1db13410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dafa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d823bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d84c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d84c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d84c6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d84d010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d84da00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d84c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d821d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d84ed80> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d84d880> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1dafaba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d877110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d89b440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d8fc1a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d8fe900> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d8fc2c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d8c9220> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d709250> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d89a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d84fce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f1b1d7094f0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_qexumud8/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d76af90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d749e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d749010> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d768ce0> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d79e810> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d79e5a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d79deb0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d79e330> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d76bc20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d79f590> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d79f7d0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d79fd10> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d129ac0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d12b6b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d12bf50> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d12d1f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d12fce0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d74b080> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d12dfa0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d137bf0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d1366c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d136450> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d136990> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d12e4b0> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d17bec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d17b890> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d17d9d0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d17d7c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d17fec0> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d17e030> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d183500> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d17ff80> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d1843e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d184620> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d184920> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d17c110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d187fe0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d00d0d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d186780> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d187b30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d1863f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d0153d0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d016150> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d00d4f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d0167b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d017320> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1d021c40> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d01edb0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d10a600> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d1fe2d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d021d00> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d017d40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d0b5b20> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cc5bcb0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1cc5bfe0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d09ed80> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d0b6690> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d0b4200> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d0b7e00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1cc72f30> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cc727e0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1cc72990> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cc71c40> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cc73020> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1ccc9b50> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cc73b30> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1d0b7ef0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cccb830> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1ccca690> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1cd09e20> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cce8170> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1cd118b0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cd09c10> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1b1cb0e1e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cb0c5f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cb07c80> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cb54080> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cb54d70> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cb570b0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1b1cb55ca0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.5048828125, "5m": 0.40283203125, "15m": 0.21875}, "ansible_fips": false, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": 
"pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2946, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 585, "free": 2946}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", 
"ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 688, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794164736, "block_size": 4096, "block_total": 65519099, "block_available": 63914591, "block_used": 1604508, "inode_total": 131070960, "inode_available": 131029044, "inode_used": 41916, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_is_chroot": false, "ansible_iscsi_iqn": "", 
"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "24", "second": "04", "epoch": "1726867444", "epoch_int": "1726867444", "date": "2024-09-20", "time": "17:24:04", "iso8601_micro": "2024-09-20T21:24:04.058510Z", "iso8601": "2024-09-20T21:24:04Z", "iso8601_basic": "20240920T172404058510", "iso8601_basic_short": "20240920T172404", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, 
"type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # 
cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing 
lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # 
cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] 
removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] 
removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing 
ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing 
ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy 
ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy 
ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy 
datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping 
copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear 
sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
[WARNING]: Module invocation had junk after the JSON data:
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
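The interpreter-discovery warning above can be avoided by pinning the remote Python explicitly via the `ansible_python_interpreter` variable. A minimal inventory sketch (host names from this log; the address and interpreter path are assumed to match what the run discovered):

```ini
; INI-form inventory equivalent; ansible_python_interpreter pins the remote
; Python so interpreter discovery (and its warning) is skipped.
[all]
managed_node1 ansible_host=10.31.12.57 ansible_python_interpreter=/usr/bin/python3.12
managed_node2 ansible_python_interpreter=/usr/bin/python3.12
```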
25039 1726867444.19584: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
25039 1726867444.19664: _low_level_execute_command(): starting
25039 1726867444.19668: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867442.5340288-25077-279723507657331/ > /dev/null 2>&1 && sleep 0'
25039 1726867444.20281: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
25039 1726867444.20297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
25039 1726867444.20316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
25039 1726867444.20335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
25039 1726867444.20361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
25039 1726867444.20392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
25039 1726867444.20470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<<
25039 1726867444.20493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
25039 1726867444.20512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
25039 1726867444.20597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
25039 1726867444.22983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
25039 1726867444.22986: stdout chunk (state=3): >>><<<
25039 1726867444.22989: stderr chunk (state=3): >>><<<
25039 1726867444.22992: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.12.57 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
25039 1726867444.22994: handler run complete
25039 1726867444.23012: variable 'ansible_facts' from source: unknown
25039 1726867444.23115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867444.23457: variable 'ansible_facts' from source: unknown
25039 1726867444.23545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867444.23681: attempt loop complete, returning result
25039 1726867444.23685: _execute() done
25039 1726867444.23687: dumping result to json
25039 1726867444.23727: done dumping result, returning
25039 1726867444.23734: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affcac9-a3a5-3ddc-7272-0000000000b9]
25039 1726867444.23738: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000b9
25039 1726867444.24450: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000b9
25039 1726867444.24454: WORKER PROCESS EXITING
ok: [managed_node1]
25039 1726867444.25312: no more pending results, returning what we have
25039 1726867444.25315: results queue empty
25039 1726867444.25316: checking for any_errors_fatal
25039 1726867444.25317: done checking for any_errors_fatal
25039 1726867444.25318: checking for max_fail_percentage
25039 1726867444.25319: done checking for max_fail_percentage
25039 1726867444.25320: checking to see if all hosts have failed and the running result is not ok
25039 1726867444.25321: done checking to see if all hosts have failed
25039 1726867444.25322: getting the remaining hosts for this loop
25039 1726867444.25324: done getting the remaining hosts for this loop
25039 1726867444.25327: getting the next task for host managed_node1
25039 1726867444.25333: done getting next task for host managed_node1
25039 1726867444.25335: ^ task is: TASK: meta (flush_handlers)
25039 1726867444.25337: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867444.25341: getting variables
25039 1726867444.25342: in VariableManager get_vars()
25039 1726867444.25363: Calling all_inventory to load vars for managed_node1
25039 1726867444.25366: Calling groups_inventory to load vars for managed_node1
25039 1726867444.25370: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867444.25675: Calling all_plugins_play to load vars for managed_node1
25039 1726867444.25680: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867444.25684: Calling groups_plugins_play to load vars for managed_node1
25039 1726867444.26464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867444.26792: done with get_vars()
25039 1726867444.26804: done getting variables
25039 1726867444.26868: in VariableManager get_vars()
25039 1726867444.26880: Calling all_inventory to load vars for managed_node1
25039 1726867444.26883: Calling groups_inventory to load vars for managed_node1
25039 1726867444.26885: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867444.26890: Calling all_plugins_play to load vars for managed_node1
25039 1726867444.26892: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867444.26895: Calling groups_plugins_play to load vars for managed_node1
25039 1726867444.27195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867444.27394: done with get_vars()
25039 1726867444.27408: done queuing things up, now waiting for results queue to drain
25039 1726867444.27411: results queue empty
25039 1726867444.27412: checking for any_errors_fatal
25039 1726867444.27414: done checking for any_errors_fatal
25039 1726867444.27419: checking for max_fail_percentage
25039 1726867444.27420: done checking for max_fail_percentage
25039 1726867444.27421: checking to see if all hosts have failed and the running result is not ok
25039 1726867444.27422: done checking to see if all hosts have failed
25039 1726867444.27423: getting the remaining hosts for this loop
25039 1726867444.27423: done getting the remaining hosts for this loop
25039 1726867444.27426: getting the next task for host managed_node1
25039 1726867444.27431: done getting next task for host managed_node1
25039 1726867444.27434: ^ task is: TASK: Include the task 'el_repo_setup.yml'
25039 1726867444.27435: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867444.27437: getting variables
25039 1726867444.27438: in VariableManager get_vars()
25039 1726867444.27447: Calling all_inventory to load vars for managed_node1
25039 1726867444.27449: Calling groups_inventory to load vars for managed_node1
25039 1726867444.27451: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867444.27468: Calling all_plugins_play to load vars for managed_node1
25039 1726867444.27471: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867444.27474: Calling groups_plugins_play to load vars for managed_node1
25039 1726867444.27639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867444.27852: done with get_vars()
25039 1726867444.27861: done getting variables

TASK [Include the task 'el_repo_setup.yml'] ************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:11
Friday 20 September 2024  17:24:04 -0400 (0:00:01.794)       0:00:01.805 ******
25039 1726867444.27953: entering _queue_task() for managed_node1/include_tasks
25039 1726867444.27955: Creating lock for include_tasks
25039 1726867444.28409: worker is 1 (out of 1 available)
25039 1726867444.28420: exiting _queue_task() for managed_node1/include_tasks
25039 1726867444.28431: done queuing things up, now waiting for results queue to drain
25039 1726867444.28434: waiting for pending results...
25039 1726867444.28678: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml'
25039 1726867444.28683: in run() - task 0affcac9-a3a5-3ddc-7272-000000000006
25039 1726867444.28697: variable 'ansible_search_path' from source: unknown
25039 1726867444.28776: calling self._execute()
25039 1726867444.28812: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867444.28824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867444.28836: variable 'omit' from source: magic vars
25039 1726867444.29003: _execute() done
25039 1726867444.29013: dumping result to json
25039 1726867444.29183: done dumping result, returning
25039 1726867444.29187: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0affcac9-a3a5-3ddc-7272-000000000006]
25039 1726867444.29189: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000006
25039 1726867444.29449: no more pending results, returning what we have
25039 1726867444.29455: in VariableManager get_vars()
25039 1726867444.29485: Calling all_inventory to load vars for managed_node1
25039 1726867444.29487: Calling groups_inventory to load vars for managed_node1
25039 1726867444.29491: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867444.29503: Calling all_plugins_play to load vars for managed_node1
25039 1726867444.29505: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867444.29508: Calling groups_plugins_play to load vars for managed_node1
25039 1726867444.29902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867444.30320: done with get_vars()
25039 1726867444.30328: variable 'ansible_search_path' from source: unknown
25039 1726867444.30341: we have included files to process
25039 1726867444.30342: generating all_blocks data
25039 1726867444.30494: done generating all_blocks data
25039 1726867444.30496: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
25039 1726867444.30502: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
25039 1726867444.30508: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000006
25039 1726867444.30511: WORKER PROCESS EXITING
25039 1726867444.30514: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
25039 1726867444.31446: in VariableManager get_vars()
25039 1726867444.31463: done with get_vars()
25039 1726867444.31573: done processing included file
25039 1726867444.31575: iterating over new_blocks loaded from include file
25039 1726867444.31578: in VariableManager get_vars()
25039 1726867444.31589: done with get_vars()
25039 1726867444.31590: filtering new block on tags
25039 1726867444.31604: done filtering new block on tags
25039 1726867444.31607: in VariableManager get_vars()
25039 1726867444.31616: done with get_vars()
25039 1726867444.31617: filtering new block on tags
25039 1726867444.31631: done filtering new block on tags
25039 1726867444.31634: in VariableManager get_vars()
25039 1726867444.31643: done with get_vars()
25039 1726867444.31644: filtering new block on tags
25039 1726867444.31808: done filtering new block on tags
25039 1726867444.31811: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1
25039 1726867444.31816: extending task lists for all hosts with included blocks
25039 1726867444.31860: done extending task lists
25039 1726867444.31862: done processing included files
25039 1726867444.31863: results queue empty
25039 1726867444.31863: checking for any_errors_fatal
25039 1726867444.31865: done checking for any_errors_fatal
25039 1726867444.31865: checking for max_fail_percentage
25039 1726867444.31867: done checking for max_fail_percentage
25039 1726867444.31867: checking to see if all hosts have failed and the running result is not ok
25039 1726867444.31868: done checking to see if all hosts have failed
25039 1726867444.31869: getting the remaining hosts for this loop
25039 1726867444.31870: done getting the remaining hosts for this loop
25039 1726867444.31987: getting the next task for host managed_node1
25039 1726867444.31992: done getting next task for host managed_node1
25039 1726867444.31994: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test
25039 1726867444.31997: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 25039 1726867444.31999: getting variables 25039 1726867444.32000: in VariableManager get_vars() 25039 1726867444.32008: Calling all_inventory to load vars for managed_node1 25039 1726867444.32010: Calling groups_inventory to load vars for managed_node1 25039 1726867444.32013: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867444.32018: Calling all_plugins_play to load vars for managed_node1 25039 1726867444.32020: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867444.32023: Calling groups_plugins_play to load vars for managed_node1 25039 1726867444.32266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867444.32623: done with get_vars() 25039 1726867444.32773: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 17:24:04 -0400 (0:00:00.049) 0:00:01.854 ****** 25039 1726867444.32874: entering _queue_task() for managed_node1/setup 25039 1726867444.33232: worker is 1 (out of 1 available) 25039 1726867444.33242: exiting _queue_task() for managed_node1/setup 25039 1726867444.33255: done queuing things up, now waiting for results queue to drain 25039 1726867444.33256: waiting for pending results... 
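[annotation] The TASK banners in this log carry a profile_tasks-style timing suffix, e.g. `(0:00:00.049) 0:00:01.854` — the parenthesized value appears to be the previous task's duration and the trailing value the cumulative elapsed time. A small sketch for converting those `H:MM:SS.mmm` stamps to seconds (illustrative helper, not part of Ansible):

```python
# Hedged sketch: parse the "H:MM:SS.mmm" stamps from the TASK banners.
def parse_duration(stamp):
    """Convert 'H:MM:SS.mmm' to seconds as a float."""
    hours, minutes, seconds = stamp.split(":")
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

print(parse_duration("0:00:01.854"))  # 1.854
```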
25039 1726867444.33485: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 25039 1726867444.33692: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000ca 25039 1726867444.33799: variable 'ansible_search_path' from source: unknown 25039 1726867444.33808: variable 'ansible_search_path' from source: unknown 25039 1726867444.33859: calling self._execute() 25039 1726867444.34063: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867444.34083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867444.34098: variable 'omit' from source: magic vars 25039 1726867444.35041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867444.37617: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867444.37714: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867444.37766: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867444.37823: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867444.37867: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867444.37955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867444.38000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867444.38083: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867444.38086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867444.38110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867444.38286: variable 'ansible_facts' from source: unknown 25039 1726867444.38361: variable 'network_test_required_facts' from source: task vars 25039 1726867444.38404: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 25039 1726867444.38415: when evaluation is False, skipping this task 25039 1726867444.38423: _execute() done 25039 1726867444.38439: dumping result to json 25039 1726867444.38543: done dumping result, returning 25039 1726867444.38546: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcac9-a3a5-3ddc-7272-0000000000ca] 25039 1726867444.38549: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000ca 25039 1726867444.38618: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000ca 25039 1726867444.38621: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 25039 1726867444.38713: no more pending results, returning what we have 25039 1726867444.38717: results queue empty 25039 1726867444.38718: checking for any_errors_fatal 25039 1726867444.38720: 
done checking for any_errors_fatal 25039 1726867444.38720: checking for max_fail_percentage 25039 1726867444.38722: done checking for max_fail_percentage 25039 1726867444.38723: checking to see if all hosts have failed and the running result is not ok 25039 1726867444.38724: done checking to see if all hosts have failed 25039 1726867444.38724: getting the remaining hosts for this loop 25039 1726867444.38726: done getting the remaining hosts for this loop 25039 1726867444.38729: getting the next task for host managed_node1 25039 1726867444.38740: done getting next task for host managed_node1 25039 1726867444.38742: ^ task is: TASK: Check if system is ostree 25039 1726867444.38745: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867444.38748: getting variables 25039 1726867444.38749: in VariableManager get_vars() 25039 1726867444.38776: Calling all_inventory to load vars for managed_node1 25039 1726867444.38780: Calling groups_inventory to load vars for managed_node1 25039 1726867444.38784: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867444.38796: Calling all_plugins_play to load vars for managed_node1 25039 1726867444.38799: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867444.38802: Calling groups_plugins_play to load vars for managed_node1 25039 1726867444.39288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867444.39498: done with get_vars() 25039 1726867444.39510: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 17:24:04 -0400 (0:00:00.067) 0:00:01.921 ****** 25039 1726867444.39605: entering _queue_task() for managed_node1/stat 25039 1726867444.39860: worker is 1 (out of 1 available) 25039 1726867444.39990: exiting _queue_task() for managed_node1/stat 25039 1726867444.39998: done queuing things up, now waiting for results queue to drain 25039 1726867444.40000: waiting for pending results... 
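[annotation] The fact-gathering task above was skipped because its `false_condition` — `not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts` — evaluated False: the intersection equalled the required list, i.e. every required fact was already present. A plain-Python sketch of that logic (the fact names below are invented, not from the playbook):

```python
# Hedged sketch of the skip condition: Jinja's `intersect` filter keeps
# items common to both lists; when the intersection equals the required
# list, all required facts are present and `not (... == ...)` is False,
# so the gathering task is skipped.
def needs_gathering(ansible_facts, required_facts):
    present = [f for f in required_facts if f in ansible_facts]
    return not (present == list(required_facts))

# Illustrative fact names only.
print(needs_gathering({"distribution": "Fedora", "os_family": "RedHat"},
                      ["distribution", "os_family"]))  # False
```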
25039 1726867444.40210: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 25039 1726867444.40284: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000cc 25039 1726867444.40287: variable 'ansible_search_path' from source: unknown 25039 1726867444.40290: variable 'ansible_search_path' from source: unknown 25039 1726867444.40416: calling self._execute() 25039 1726867444.40419: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867444.40423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867444.40436: variable 'omit' from source: magic vars 25039 1726867444.40950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867444.41259: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867444.41325: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867444.41365: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867444.41603: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867444.41676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867444.41821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867444.41833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867444.41836: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867444.41915: Evaluated conditional (not __network_is_ostree is defined): True 25039 1726867444.41927: variable 'omit' from source: magic vars 25039 1726867444.41975: variable 'omit' from source: magic vars 25039 1726867444.42019: variable 'omit' from source: magic vars 25039 1726867444.42059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867444.42093: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867444.42120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867444.42141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867444.42165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867444.42270: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867444.42273: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867444.42276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867444.42324: Set connection var ansible_shell_executable to /bin/sh 25039 1726867444.42336: Set connection var ansible_timeout to 10 25039 1726867444.42345: Set connection var ansible_shell_type to sh 25039 1726867444.42351: Set connection var ansible_connection to ssh 25039 1726867444.42361: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867444.42375: Set connection var ansible_pipelining to False 25039 1726867444.42412: variable 'ansible_shell_executable' from source: unknown 25039 1726867444.42420: variable 'ansible_connection' from 
source: unknown 25039 1726867444.42487: variable 'ansible_module_compression' from source: unknown 25039 1726867444.42490: variable 'ansible_shell_type' from source: unknown 25039 1726867444.42493: variable 'ansible_shell_executable' from source: unknown 25039 1726867444.42498: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867444.42500: variable 'ansible_pipelining' from source: unknown 25039 1726867444.42502: variable 'ansible_timeout' from source: unknown 25039 1726867444.42504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867444.42624: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25039 1726867444.42640: variable 'omit' from source: magic vars 25039 1726867444.42650: starting attempt loop 25039 1726867444.42657: running the handler 25039 1726867444.42716: _low_level_execute_command(): starting 25039 1726867444.42736: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867444.43595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867444.43657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867444.43721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867444.45397: stdout chunk (state=3): >>>/root <<< 25039 1726867444.45538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867444.45555: stdout chunk (state=3): >>><<< 25039 1726867444.45573: stderr chunk (state=3): >>><<< 25039 1726867444.45690: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 25039 1726867444.45701: _low_level_execute_command(): starting 25039 1726867444.45704: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619 `" && echo ansible-tmp-1726867444.456027-25164-154692999339619="` echo /root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619 `" ) && sleep 0' 25039 1726867444.46257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867444.46272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867444.46332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867444.46404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867444.46456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867444.46615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867444.48449: stdout chunk (state=3): 
>>>ansible-tmp-1726867444.456027-25164-154692999339619=/root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619 <<< 25039 1726867444.48606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867444.48609: stdout chunk (state=3): >>><<< 25039 1726867444.48612: stderr chunk (state=3): >>><<< 25039 1726867444.48783: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867444.456027-25164-154692999339619=/root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867444.48787: variable 'ansible_module_compression' from source: unknown 25039 1726867444.48789: ANSIBALLZ: Using lock for stat 25039 1726867444.48791: ANSIBALLZ: Acquiring lock 25039 1726867444.48793: ANSIBALLZ: Lock acquired: 140682442528704 25039 1726867444.48794: ANSIBALLZ: 
Creating module 25039 1726867444.70845: ANSIBALLZ: Writing module into payload 25039 1726867444.71093: ANSIBALLZ: Writing module 25039 1726867444.71097: ANSIBALLZ: Renaming module 25039 1726867444.71108: ANSIBALLZ: Done creating module 25039 1726867444.71131: variable 'ansible_facts' from source: unknown 25039 1726867444.71331: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619/AnsiballZ_stat.py 25039 1726867444.71542: Sending initial data 25039 1726867444.71592: Sent initial data (152 bytes) 25039 1726867444.72493: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867444.72544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867444.72547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867444.72690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 <<< 25039 1726867444.75047: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25039 1726867444.75051: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867444.75091: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867444.75247: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp3uspywty /root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619/AnsiballZ_stat.py <<< 25039 1726867444.75271: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619/AnsiballZ_stat.py" <<< 25039 1726867444.75275: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp3uspywty" to remote "/root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619/AnsiballZ_stat.py" <<< 25039 1726867444.76492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867444.76543: stderr chunk (state=3): >>><<< 25039 1726867444.76552: stdout chunk (state=3): >>><<< 25039 1726867444.76680: done transferring module to remote 25039 1726867444.76683: _low_level_execute_command(): 
starting 25039 1726867444.76686: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619/ /root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619/AnsiballZ_stat.py && sleep 0' 25039 1726867444.77817: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867444.77820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867444.77822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867444.77824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867444.77831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867444.77833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867444.77960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867444.78129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867444.78180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25039 1726867444.80832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 
1726867444.81067: stderr chunk (state=3): >>><<< 25039 1726867444.81070: stdout chunk (state=3): >>><<< 25039 1726867444.81073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25039 1726867444.81076: _low_level_execute_command(): starting 25039 1726867444.81080: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619/AnsiballZ_stat.py && sleep 0' 25039 1726867444.82159: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867444.82173: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867444.82188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867444.82343: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867444.82504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867444.82556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25039 1726867444.85563: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 25039 1726867444.85620: stdout chunk (state=3): >>>import _imp # builtin <<< 25039 1726867444.85658: stdout chunk (state=3): >>>import '_thread' # <<< 25039 1726867444.85686: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 25039 1726867444.85784: stdout chunk (state=3): >>>import '_io' # <<< 25039 1726867444.85837: stdout chunk (state=3): >>>import 'marshal' # <<< 25039 1726867444.85950: stdout chunk (state=3): >>>import 'posix' # <<< 25039 1726867444.86068: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 25039 1726867444.86097: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 25039 1726867444.86135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 25039 1726867444.86167: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8d104d0> <<< 25039 1726867444.86188: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8cdfb30> <<< 25039 1726867444.86213: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 25039 1726867444.86230: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8d12a50> <<< 25039 1726867444.86285: stdout chunk (state=3): >>>import '_signal' # <<< 25039 1726867444.86309: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 25039 1726867444.86375: stdout chunk (state=3): >>>import 'io' # <<< 25039 1726867444.86392: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 25039 1726867444.86509: stdout chunk (state=3): >>>import '_collections_abc' # <<< 25039 1726867444.86553: stdout chunk (state=3): >>>import 'genericpath' # <<< 25039 1726867444.86561: stdout chunk (state=3): >>>import 'posixpath' # <<< 25039 1726867444.86607: stdout chunk (state=3): >>>import 'os' # <<< 25039 1726867444.86835: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: 
'/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8ae5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 25039 1726867444.86838: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 25039 1726867444.86851: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8ae5fa0> import 'site' # <<< 25039 1726867444.86883: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 25039 1726867444.87347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 25039 1726867444.87372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 25039 1726867444.87387: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 25039 1726867444.87428: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 25039 1726867444.87462: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 25039 1726867444.87523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b23e90> <<< 25039 1726867444.87550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 25039 1726867444.87795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b23f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 25039 1726867444.87817: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b5b830> <<< 25039 1726867444.87859: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 25039 1726867444.87940: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 25039 1726867444.87944: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b5bec0> <<< 25039 1726867444.87955: stdout chunk (state=3): >>>import '_collections' # <<< 25039 1726867444.88013: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b3bb60> import '_functools' # <<< 25039 1726867444.88320: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b39280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b21040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 25039 1726867444.88346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 25039 1726867444.88366: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 25039 1726867444.88392: stdout chunk (state=3): >>>import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b7b7d0> <<< 25039 1726867444.88412: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b7a3f0> <<< 25039 1726867444.88575: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 25039 1726867444.88579: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b3a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b78c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb0860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b202c0> <<< 25039 1726867444.88602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 25039 1726867444.88635: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867444.88673: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8bb0d10> <<< 25039 1726867444.88687: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb0bc0> <<< 25039 1726867444.88714: stdout chunk (state=3): >>># extension module 'binascii' loaded from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867444.88785: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8bb0f80> <<< 25039 1726867444.88813: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b1ede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 25039 1726867444.88946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb1610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb12e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 25039 1726867444.88984: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb2510> <<< 25039 1726867444.89034: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 25039 1726867444.89044: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 25039 1726867444.89117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 25039 1726867444.89144: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 25039 1726867444.89203: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 25039 1726867444.89220: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bc8710> <<< 25039 1726867444.89240: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867444.89457: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8bc9df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bcac90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8bcb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bca1e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed 
from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8bcbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bcb4a0> <<< 25039 1726867444.89513: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb2540> <<< 25039 1726867444.89807: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 25039 1726867444.89810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 25039 1726867444.89859: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8953bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d897c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d897c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d897c6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 25039 1726867444.89890: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867444.90073: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867444.90094: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d897d010> <<< 25039 1726867444.90241: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867444.90266: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d897da00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d897c8c0> <<< 25039 1726867444.90302: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8951d90> <<< 25039 1726867444.90330: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 25039 1726867444.90402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 25039 1726867444.90422: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 25039 1726867444.90445: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d897ee10> <<< 25039 1726867444.90500: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d897db50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb2c30> <<< 25039 1726867444.90532: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 25039 1726867444.90617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 25039 1726867444.90856: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d89a71a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 25039 1726867444.90920: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d89cb530> <<< 25039 1726867444.90953: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 25039 1726867444.91026: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 25039 1726867444.91114: stdout chunk 
(state=3): >>>import 'ntpath' # <<< 25039 1726867444.91255: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8a2c290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 25039 1726867444.91269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 25039 1726867444.91298: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py<<< 25039 1726867444.91326: stdout chunk (state=3): >>> <<< 25039 1726867444.91381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 25039 1726867444.91392: stdout chunk (state=3): >>> <<< 25039 1726867444.91521: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8a2e9f0><<< 25039 1726867444.91557: stdout chunk (state=3): >>> <<< 25039 1726867444.91661: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8a2c3b0><<< 25039 1726867444.91692: stdout chunk (state=3): >>> <<< 25039 1726867444.91818: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d89f12e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 25039 1726867444.91822: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83293a0> import 'zipfile._path' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc5d89ca360><<< 25039 1726867444.91824: stdout chunk (state=3): >>> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d897fd70><<< 25039 1726867444.92004: stdout chunk (state=3): >>> <<< 25039 1726867444.92007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 25039 1726867444.92044: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc5d8329640> <<< 25039 1726867444.92259: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_13vb3qzm/ansible_stat_payload.zip' <<< 25039 1726867444.92319: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867444.92549: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 25039 1726867444.92583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 25039 1726867444.92671: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 25039 1726867444.92675: stdout chunk (state=3): >>> <<< 25039 1726867444.92835: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 25039 1726867444.92839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 25039 1726867444.92874: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d837f080> import '_typing' # <<< 25039 1726867444.93052: stdout chunk (state=3): >>> <<< 25039 1726867444.93172: stdout chunk (state=3): >>>import 'typing' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc5d835df70> <<< 25039 1726867444.93215: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d835d100> <<< 25039 1726867444.93338: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867444.93343: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 25039 1726867444.93391: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867444.93453: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 25039 1726867444.93468: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867444.95662: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867444.95675: stdout chunk (state=3): >>> <<< 25039 1726867444.97479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py<<< 25039 1726867444.97496: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc'<<< 25039 1726867444.97500: stdout chunk (state=3): >>> import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d837cf50><<< 25039 1726867444.97537: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 25039 1726867444.97559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc'<<< 25039 1726867444.97599: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py<<< 25039 1726867444.97604: stdout chunk (state=3): >>> <<< 25039 1726867444.97616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc'<<< 25039 1726867444.97649: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 25039 1726867444.97665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 25039 1726867444.97675: stdout chunk (state=3): >>> <<< 25039 1726867444.97714: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d83a6a80><<< 25039 1726867444.97771: stdout chunk (state=3): >>> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83a6810><<< 25039 1726867444.97786: stdout chunk (state=3): >>> <<< 25039 1726867444.97828: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83a6120> <<< 25039 1726867444.97857: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py<<< 25039 1726867444.97882: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 25039 1726867444.97937: stdout chunk (state=3): >>> import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83a6570><<< 25039 1726867444.98059: stdout chunk (state=3): >>> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d837fd10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d83a77d0> # extension module 'fcntl' loaded from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d83a7a10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 25039 1726867444.98138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 25039 1726867444.98166: stdout chunk (state=3): >>>import '_locale' # <<< 25039 1726867444.98226: stdout chunk (state=3): >>> import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83a7f50><<< 25039 1726867444.98258: stdout chunk (state=3): >>> import 'pwd' # <<< 25039 1726867444.98298: stdout chunk (state=3): >>> <<< 25039 1726867444.98301: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py<<< 25039 1726867444.98303: stdout chunk (state=3): >>> <<< 25039 1726867444.98399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8211d90> <<< 25039 1726867444.98447: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867444.98461: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867444.98506: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d82139b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 25039 1726867444.98535: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 25039 1726867444.98603: stdout chunk (state=3): >>> <<< 25039 1726867444.98618: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8214380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 25039 1726867444.98629: stdout chunk (state=3): >>> <<< 25039 1726867444.98687: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8215520><<< 25039 1726867444.98710: stdout chunk (state=3): >>> <<< 25039 1726867444.98756: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 25039 1726867444.98788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 25039 1726867444.98839: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 25039 1726867444.98850: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 25039 1726867444.99014: stdout chunk (state=3): >>> import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8217f80> <<< 25039 1726867444.99233: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d821c2c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8216270> <<< 25039 1726867444.99237: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 25039 1726867444.99261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 25039 1726867444.99309: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d821ffe0><<< 25039 1726867444.99346: stdout chunk (state=3): >>> import '_tokenize' # <<< 25039 1726867444.99451: stdout chunk (state=3): >>> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d821eab0><<< 25039 1726867444.99479: stdout chunk (state=3): >>> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d821e810> <<< 25039 1726867444.99507: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 25039 1726867444.99558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 25039 1726867444.99662: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d821ed80><<< 25039 1726867444.99670: stdout chunk (state=3): >>> <<< 25039 1726867444.99698: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8216750><<< 25039 1726867444.99738: stdout chunk (state=3): >>> # 
extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867444.99782: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 25039 1726867444.99817: stdout chunk (state=3): >>> import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8268230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 25039 1726867444.99850: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8268410><<< 25039 1726867444.99854: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 25039 1726867444.99922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 25039 1726867444.99926: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 25039 1726867444.99998: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8269e80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8269c10><<< 25039 1726867445.00014: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py<<< 25039 1726867445.00200: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 25039 1726867445.00258: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 25039 1726867445.00261: stdout chunk (state=3): >>> import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d826c320><<< 25039 1726867445.00308: stdout chunk (state=3): >>> <<< 25039 1726867445.00351: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d826a480> <<< 25039 1726867445.00354: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 25039 1726867445.00416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 25039 1726867445.00455: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 25039 1726867445.00457: stdout chunk (state=3): >>> <<< 25039 1726867445.00487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 25039 1726867445.00497: stdout chunk (state=3): >>> import '_string' # <<< 25039 1726867445.00576: stdout chunk (state=3): >>> import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d826faa0> <<< 25039 1726867445.00783: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d826c470> <<< 25039 1726867445.00867: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867445.00885: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 25039 1726867445.00894: stdout chunk (state=3): >>> <<< 25039 1726867445.00938: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d82708f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867445.00959: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867445.01050: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8270920> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867445.01066: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8270e00> <<< 25039 1726867445.01119: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8268590><<< 25039 1726867445.01147: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 25039 1726867445.01176: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py 
<<< 25039 1726867445.01254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867445.01302: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d82fc500><<< 25039 1726867445.01458: stdout chunk (state=3): >>> <<< 25039 1726867445.01583: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867445.01618: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d82fd3a0> <<< 25039 1726867445.01671: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8272c90> <<< 25039 1726867445.01731: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867445.01735: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8270da0> <<< 25039 1726867445.01767: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d82728a0> <<< 25039 1726867445.01799: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.01828: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.01839: stdout chunk (state=3): 
>>> import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 25039 1726867445.01978: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.02113: stdout chunk (state=3): >>> # zipimport: zlib available<<< 25039 1726867445.02143: stdout chunk (state=3): >>> # zipimport: zlib available<<< 25039 1726867445.02180: stdout chunk (state=3): >>> import 'ansible.module_utils.common' # # zipimport: zlib available <<< 25039 1726867445.02237: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 25039 1726867445.02267: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.02456: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.02470: stdout chunk (state=3): >>> <<< 25039 1726867445.02788: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.03655: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.04485: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 25039 1726867445.04511: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 25039 1726867445.04533: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 25039 1726867445.04558: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 25039 1726867445.04622: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 25039 1726867445.04696: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 25039 1726867445.04828: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d81016a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 25039 1726867445.04839: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc'<<< 25039 1726867445.04848: stdout chunk (state=3): >>> <<< 25039 1726867445.04880: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8102480><<< 25039 1726867445.04886: stdout chunk (state=3): >>> <<< 25039 1726867445.04915: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d82fd4c0><<< 25039 1726867445.04918: stdout chunk (state=3): >>> <<< 25039 1726867445.04984: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 25039 1726867445.04988: stdout chunk (state=3): >>> <<< 25039 1726867445.05012: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.05020: stdout chunk (state=3): >>> <<< 25039 1726867445.05052: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.05081: stdout chunk (state=3): >>> import 'ansible.module_utils._text' # <<< 25039 1726867445.05282: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.05388: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.05606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py<<< 25039 1726867445.05653: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8102510> <<< 25039 1726867445.05674: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.06439: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.06445: stdout 
chunk (state=3): >>> <<< 25039 1726867445.07185: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.07199: stdout chunk (state=3): >>> <<< 25039 1726867445.07304: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.07318: stdout chunk (state=3): >>> <<< 25039 1726867445.07424: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 25039 1726867445.07429: stdout chunk (state=3): >>> <<< 25039 1726867445.07446: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.07506: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867445.07564: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 25039 1726867445.07595: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.07824: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # <<< 25039 1726867445.07859: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.07864: stdout chunk (state=3): >>> <<< 25039 1726867445.07890: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.07917: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 25039 1726867445.07923: stdout chunk (state=3): >>> <<< 25039 1726867445.07951: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.08029: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.08090: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 25039 1726867445.08125: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.08349: stdout chunk (state=3): >>> <<< 25039 1726867445.08510: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.08907: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 25039 1726867445.08913: stdout chunk (state=3): 
>>> <<< 25039 1726867445.09006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 25039 1726867445.09040: stdout chunk (state=3): >>>import '_ast' # <<< 25039 1726867445.09146: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8103710><<< 25039 1726867445.09151: stdout chunk (state=3): >>> <<< 25039 1726867445.09175: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.09295: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.09402: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 25039 1726867445.09427: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 25039 1726867445.09448: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 25039 1726867445.09469: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 25039 1726867445.09505: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.09578: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.09640: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 25039 1726867445.09668: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.09741: stdout chunk (state=3): >>> # zipimport: zlib available<<< 25039 1726867445.09746: stdout chunk (state=3): >>> <<< 25039 1726867445.09816: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.09821: stdout chunk (state=3): >>> <<< 25039 1726867445.09918: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.09924: stdout chunk (state=3): >>> <<< 25039 1726867445.10035: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py<<< 25039 1726867445.10040: stdout chunk 
(state=3): >>> <<< 25039 1726867445.10222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 25039 1726867445.10238: stdout chunk (state=3): >>> <<< 25039 1726867445.10249: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 25039 1726867445.10259: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d810e1b0><<< 25039 1726867445.10266: stdout chunk (state=3): >>> <<< 25039 1726867445.10326: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8109970><<< 25039 1726867445.10335: stdout chunk (state=3): >>> <<< 25039 1726867445.10375: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 25039 1726867445.10389: stdout chunk (state=3): >>> <<< 25039 1726867445.10402: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 25039 1726867445.10420: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.10426: stdout chunk (state=3): >>> <<< 25039 1726867445.10522: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.10626: stdout chunk (state=3): >>> # zipimport: zlib available <<< 25039 1726867445.10683: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.10746: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 25039 1726867445.10960: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 25039 1726867445.10964: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 25039 1726867445.11071: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83fab10> <<< 25039 1726867445.11265: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83ee7e0> <<< 25039 1726867445.11270: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d810e2d0> <<< 25039 1726867445.11316: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8103110> <<< 25039 1726867445.11319: stdout chunk (state=3): >>># destroy ansible.module_utils.distro <<< 25039 1726867445.11322: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 25039 1726867445.11323: stdout chunk (state=3): >>> <<< 25039 1726867445.11349: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.11354: stdout chunk (state=3): >>> <<< 25039 1726867445.11400: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.11405: stdout chunk (state=3): >>> <<< 25039 1726867445.11445: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 25039 1726867445.11451: stdout chunk (state=3): >>> <<< 25039 1726867445.11462: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.sys_info' # <<< 25039 1726867445.11537: stdout chunk (state=3): >>> import 'ansible.module_utils.basic' # <<< 25039 1726867445.11567: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.11604: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.11617: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 25039 1726867445.11645: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.11859: stdout chunk (state=3): >>># zipimport: zlib available<<< 25039 1726867445.12046: stdout chunk (state=3): >>> <<< 25039 1726867445.12168: stdout chunk (state=3): >>># zipimport: zlib available <<< 25039 1726867445.12326: stdout chunk (state=3): >>> <<< 25039 1726867445.12337: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}<<< 25039 1726867445.12344: stdout chunk (state=3): >>> <<< 25039 1726867445.12375: stdout chunk (state=3): >>># destroy __main__<<< 25039 1726867445.12383: stdout chunk (state=3): >>> <<< 25039 1726867445.12858: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 25039 1726867445.12875: stdout chunk (state=3): >>> <<< 25039 1726867445.12881: stdout chunk (state=3): >>># clear sys.path_hooks<<< 25039 1726867445.12900: stdout chunk (state=3): >>> <<< 25039 1726867445.12904: stdout chunk (state=3): >>># clear builtins._ # clear sys.path<<< 25039 1726867445.12932: stdout chunk (state=3): >>> # clear sys.argv <<< 25039 1726867445.12953: stdout chunk (state=3): >>># clear sys.ps1 <<< 25039 1726867445.12957: stdout chunk (state=3): >>># clear sys.ps2<<< 25039 1726867445.12969: stdout chunk (state=3): >>> <<< 25039 1726867445.12989: stdout chunk (state=3): >>># clear sys.last_exc # clear sys.last_type<<< 25039 1726867445.12993: stdout chunk (state=3): >>> # clear 
sys.last_value<<< 25039 1726867445.13013: stdout chunk (state=3): >>> # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin<<< 25039 1726867445.13038: stdout chunk (state=3): >>> # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal<<< 25039 1726867445.13062: stdout chunk (state=3): >>> # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io <<< 25039 1726867445.13095: stdout chunk (state=3): >>># cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools <<< 25039 1726867445.13124: stdout chunk (state=3): >>># cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # 
cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re <<< 25039 1726867445.13156: stdout chunk (state=3): >>># cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib<<< 25039 1726867445.13183: stdout chunk (state=3): >>> # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random<<< 25039 1726867445.13211: stdout chunk (state=3): >>> # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib<<< 25039 1726867445.13230: stdout chunk (state=3): >>> # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # 
cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils<<< 25039 1726867445.13264: stdout chunk (state=3): >>> # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform<<< 25039 1726867445.13289: stdout chunk (state=3): >>> # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd<<< 25039 1726867445.13314: stdout chunk (state=3): >>> # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket<<< 25039 1726867445.13337: stdout chunk (state=3): >>> # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common 
# cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six<<< 25039 1726867445.13365: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy<<< 25039 1726867445.13388: stdout chunk (state=3): >>> # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast <<< 25039 1726867445.13414: stdout chunk (state=3): >>># destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy 
ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux<<< 25039 1726867445.13438: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info<<< 25039 1726867445.13651: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 25039 1726867445.13751: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 25039 1726867445.13784: stdout chunk (state=3): >>># destroy importlib.machinery <<< 25039 1726867445.13800: stdout chunk (state=3): >>># destroy importlib._abc <<< 25039 1726867445.13810: stdout chunk (state=3): >>># destroy importlib.util <<< 25039 1726867445.13833: stdout chunk (state=3): >>># destroy _bz2<<< 25039 1726867445.13843: stdout chunk (state=3): >>> # destroy _compression<<< 25039 1726867445.13862: stdout chunk (state=3): >>> # destroy _lzma<<< 25039 1726867445.13875: stdout chunk (state=3): >>> <<< 25039 1726867445.13899: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii<<< 25039 1726867445.13926: stdout chunk (state=3): >>> # destroy struct <<< 25039 1726867445.13939: stdout chunk (state=3): >>># destroy zlib # destroy bz2<<< 25039 1726867445.13956: stdout chunk (state=3): >>> # destroy lzma<<< 25039 1726867445.13965: stdout chunk 
(state=3): >>> # destroy zipfile._path<<< 25039 1726867445.13983: stdout chunk (state=3): >>> <<< 25039 1726867445.14010: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib<<< 25039 1726867445.14015: stdout chunk (state=3): >>> # destroy zipfile._path.glob<<< 25039 1726867445.14034: stdout chunk (state=3): >>> # destroy fnmatch<<< 25039 1726867445.14074: stdout chunk (state=3): >>> # destroy ipaddress # destroy ntpath<<< 25039 1726867445.14088: stdout chunk (state=3): >>> <<< 25039 1726867445.14100: stdout chunk (state=3): >>># destroy importlib # destroy zipimport<<< 25039 1726867445.14121: stdout chunk (state=3): >>> # destroy __main__<<< 25039 1726867445.14126: stdout chunk (state=3): >>> # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib<<< 25039 1726867445.14154: stdout chunk (state=3): >>> # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json<<< 25039 1726867445.14179: stdout chunk (state=3): >>> # destroy grp # destroy encodings # destroy _locale<<< 25039 1726867445.14188: stdout chunk (state=3): >>> # destroy pwd<<< 25039 1726867445.14198: stdout chunk (state=3): >>> # destroy locale<<< 25039 1726867445.14222: stdout chunk (state=3): >>> # destroy signal # destroy fcntl # destroy select # destroy _signal<<< 25039 1726867445.14234: stdout chunk (state=3): >>> # destroy _posixsubprocess # destroy syslog<<< 25039 1726867445.14268: stdout chunk (state=3): >>> # destroy uuid # destroy selectors <<< 25039 1726867445.14289: stdout chunk (state=3): >>># destroy errno # destroy array<<< 25039 1726867445.14294: stdout chunk (state=3): >>> # destroy datetime<<< 25039 1726867445.14332: stdout chunk (state=3): >>> # destroy selinux<<< 25039 1726867445.14348: stdout chunk (state=3): >>> # destroy shutil <<< 25039 1726867445.14368: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json<<< 25039 
1726867445.14431: stdout chunk (state=3): >>> # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux<<< 25039 1726867445.14447: stdout chunk (state=3): >>> <<< 25039 1726867445.14455: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian<<< 25039 1726867445.14463: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes<<< 25039 1726867445.14491: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 25039 1726867445.14496: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime<<< 25039 1726867445.14527: stdout chunk (state=3): >>> # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 25039 1726867445.14537: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 25039 1726867445.14561: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib<<< 25039 1726867445.14576: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect<<< 25039 1726867445.14600: stdout chunk (state=3): >>> # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap<<< 25039 1726867445.14605: stdout chunk (state=3): >>> # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum<<< 25039 1726867445.14633: stdout chunk (state=3): >>> # cleanup[3] wiping copyreg 
# cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 25039 1726867445.14639: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 25039 1726867445.14660: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections <<< 25039 1726867445.14682: stdout chunk (state=3): >>># cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath<<< 25039 1726867445.14705: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat<<< 25039 1726867445.14708: stdout chunk (state=3): >>> # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time<<< 25039 1726867445.14732: stdout chunk (state=3): >>> # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 25039 1726867445.14751: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 25039 1726867445.14772: stdout chunk (state=3): >>> # cleanup[3] wiping builtins<<< 25039 1726867445.14783: stdout chunk (state=3): >>> # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader<<< 25039 1726867445.14951: stdout chunk (state=3): >>> # destroy systemd._journal # destroy _datetime <<< 25039 1726867445.14972: stdout chunk (state=3): >>># destroy sys.monitoring<<< 25039 1726867445.14981: stdout chunk (state=3): >>> <<< 25039 1726867445.14999: stdout chunk (state=3): >>># destroy _socket <<< 25039 
1726867445.15030: stdout chunk (state=3): >>># destroy _collections<<< 25039 1726867445.15035: stdout chunk (state=3): >>> <<< 25039 1726867445.15065: stdout chunk (state=3): >>># destroy platform<<< 25039 1726867445.15073: stdout chunk (state=3): >>> <<< 25039 1726867445.15100: stdout chunk (state=3): >>># destroy _uuid # destroy stat<<< 25039 1726867445.15118: stdout chunk (state=3): >>> <<< 25039 1726867445.15124: stdout chunk (state=3): >>># destroy genericpath <<< 25039 1726867445.15154: stdout chunk (state=3): >>># destroy re._parser <<< 25039 1726867445.15158: stdout chunk (state=3): >>># destroy tokenize<<< 25039 1726867445.15163: stdout chunk (state=3): >>> <<< 25039 1726867445.15194: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib<<< 25039 1726867445.15202: stdout chunk (state=3): >>> <<< 25039 1726867445.15207: stdout chunk (state=3): >>># destroy copyreg<<< 25039 1726867445.15257: stdout chunk (state=3): >>> # destroy contextlib # destroy _typing <<< 25039 1726867445.15274: stdout chunk (state=3): >>># destroy _tokenize <<< 25039 1726867445.15293: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser<<< 25039 1726867445.15315: stdout chunk (state=3): >>> # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 25039 1726867445.15462: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs <<< 25039 1726867445.15466: stdout chunk (state=3): >>># destroy encodings.aliases <<< 25039 1726867445.15494: stdout chunk (state=3): >>># destroy encodings.utf_8<<< 25039 1726867445.15511: stdout chunk (state=3): >>> # 
destroy encodings.utf_8_sig <<< 25039 1726867445.15516: stdout chunk (state=3): >>># destroy encodings.cp437 <<< 25039 1726867445.15536: stdout chunk (state=3): >>># destroy _codecs <<< 25039 1726867445.15552: stdout chunk (state=3): >>># destroy io<<< 25039 1726867445.15555: stdout chunk (state=3): >>> # destroy traceback<<< 25039 1726867445.15576: stdout chunk (state=3): >>> <<< 25039 1726867445.15583: stdout chunk (state=3): >>># destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit<<< 25039 1726867445.15601: stdout chunk (state=3): >>> # destroy _warnings # destroy math # destroy _bisect <<< 25039 1726867445.15638: stdout chunk (state=3): >>># destroy time # destroy _random<<< 25039 1726867445.15645: stdout chunk (state=3): >>> <<< 25039 1726867445.15650: stdout chunk (state=3): >>># destroy _weakref<<< 25039 1726867445.15676: stdout chunk (state=3): >>> # destroy _hashlib <<< 25039 1726867445.15700: stdout chunk (state=3): >>># destroy _operator<<< 25039 1726867445.15721: stdout chunk (state=3): >>> # destroy _string # destroy re # destroy itertools<<< 25039 1726867445.15742: stdout chunk (state=3): >>> # destroy _abc <<< 25039 1726867445.15765: stdout chunk (state=3): >>># destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 25039 1726867445.15780: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 25039 1726867445.15949: stdout chunk (state=3): >>> <<< 25039 1726867445.16255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867445.16285: stderr chunk (state=3): >>><<< 25039 1726867445.16288: stdout chunk (state=3): >>><<< 25039 1726867445.16351: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8d104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8cdfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8d12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8ae5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8ae5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b23e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b23f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b5b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b5bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b3bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b39280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b21040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b7b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b7a3f0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b3a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b78c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb0860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b202c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8bb0d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb0bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8bb0f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8b1ede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb1610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb12e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb2510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bc8710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8bc9df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bcac90> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8bcb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bca1e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8bcbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bcb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb2540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8953bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d897c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d897c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d897c6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d897d010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d897da00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d897c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8951d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d897ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d897db50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8bb2c30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d89a71a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d89cb530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8a2c290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8a2e9f0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8a2c3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d89f12e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83293a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d89ca360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d897fd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc5d8329640> # zipimport: found 30 names in '/tmp/ansible_stat_payload_13vb3qzm/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc5d837f080> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d835df70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d835d100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d837cf50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d83a6a80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83a6810> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83a6120> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83a6570> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d837fd10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d83a77d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d83a7a10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83a7f50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8211d90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d82139b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8214380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8215520> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8217f80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d821c2c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8216270> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d821ffe0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d821eab0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d821e810> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d821ed80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8216750> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8268230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8268410> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8269e80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8269c10> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d826c320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d826a480> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d826faa0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d826c470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d82708f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8270920> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8270e00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8268590> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d82fc500> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d82fd3a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8272c90> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d8270da0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d82728a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d81016a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8102480> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d82fd4c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8102510> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8103710> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc5d810e1b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8109970> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83fab10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d83ee7e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d810e2d0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc5d8103110> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. [WARNING]: Module invocation had junk after the JSON data:
destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 25039 1726867445.16875: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867445.16880: _low_level_execute_command(): starting 25039 1726867445.16882: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867444.456027-25164-154692999339619/ > /dev/null 2>&1 && sleep 0' 25039 1726867445.17014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867445.17025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867445.17028: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867445.17030: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867445.17032: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867445.17073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867445.17076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867445.17085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867445.17137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25039 1726867445.19630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867445.19655: stderr chunk (state=3): >>><<< 25039 1726867445.19658: stdout chunk (state=3): >>><<< 25039 1726867445.19673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25039 1726867445.19681: handler run complete 25039 1726867445.19695: attempt loop complete, returning result 25039 1726867445.19698: _execute() done 25039 1726867445.19702: dumping result to json 25039 1726867445.19705: done dumping result, returning 25039 1726867445.19719: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0affcac9-a3a5-3ddc-7272-0000000000cc] 25039 1726867445.19721: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000cc 25039 1726867445.19801: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000cc 25039 1726867445.19803: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 25039 1726867445.19861: no more pending results, returning what we have 25039 1726867445.19865: results queue empty 25039 1726867445.19865: checking for any_errors_fatal 25039 1726867445.19870: done checking for any_errors_fatal 25039 1726867445.19871: checking for max_fail_percentage 25039 1726867445.19872: done checking for max_fail_percentage 25039 1726867445.19873: checking to see if all hosts have failed and the running result is not ok 25039 1726867445.19874: done checking to see if all hosts have failed 25039 1726867445.19874: getting the remaining hosts for this loop 25039 1726867445.19875: done getting the remaining hosts for this loop 25039 1726867445.19881: getting the next task for host managed_node1 25039 1726867445.19886: done getting next task for host managed_node1 25039 1726867445.19888: ^ task is: TASK: Set flag to indicate system is ostree 25039 1726867445.19890: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867445.19893: getting variables 25039 1726867445.19894: in VariableManager get_vars() 25039 1726867445.19924: Calling all_inventory to load vars for managed_node1 25039 1726867445.19926: Calling groups_inventory to load vars for managed_node1 25039 1726867445.19930: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.19940: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.19943: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.19945: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.20114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.20250: done with get_vars() 25039 1726867445.20257: done getting variables 25039 1726867445.20331: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 17:24:05 -0400 (0:00:00.807) 0:00:02.729 ****** 25039 1726867445.20351: entering _queue_task() for managed_node1/set_fact 25039 1726867445.20352: 
Creating lock for set_fact 25039 1726867445.20561: worker is 1 (out of 1 available) 25039 1726867445.20574: exiting _queue_task() for managed_node1/set_fact 25039 1726867445.20587: done queuing things up, now waiting for results queue to drain 25039 1726867445.20589: waiting for pending results... 25039 1726867445.20731: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 25039 1726867445.20796: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000cd 25039 1726867445.20804: variable 'ansible_search_path' from source: unknown 25039 1726867445.20809: variable 'ansible_search_path' from source: unknown 25039 1726867445.20839: calling self._execute() 25039 1726867445.20891: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.20896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.20905: variable 'omit' from source: magic vars 25039 1726867445.21243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867445.21419: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867445.21451: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867445.21480: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867445.21505: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867445.21589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867445.21607: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 
25039 1726867445.21627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867445.21644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867445.21737: Evaluated conditional (not __network_is_ostree is defined): True 25039 1726867445.21740: variable 'omit' from source: magic vars 25039 1726867445.21764: variable 'omit' from source: magic vars 25039 1726867445.21847: variable '__ostree_booted_stat' from source: set_fact 25039 1726867445.21884: variable 'omit' from source: magic vars 25039 1726867445.21905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867445.21932: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867445.21945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867445.21958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867445.21967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867445.21991: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867445.21994: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.21997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.22066: Set connection var ansible_shell_executable to /bin/sh 25039 1726867445.22070: Set connection var ansible_timeout to 10 25039 1726867445.22076: Set connection var ansible_shell_type to sh 25039 
1726867445.22080: Set connection var ansible_connection to ssh 25039 1726867445.22087: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867445.22091: Set connection var ansible_pipelining to False 25039 1726867445.22109: variable 'ansible_shell_executable' from source: unknown 25039 1726867445.22117: variable 'ansible_connection' from source: unknown 25039 1726867445.22120: variable 'ansible_module_compression' from source: unknown 25039 1726867445.22124: variable 'ansible_shell_type' from source: unknown 25039 1726867445.22126: variable 'ansible_shell_executable' from source: unknown 25039 1726867445.22129: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.22131: variable 'ansible_pipelining' from source: unknown 25039 1726867445.22133: variable 'ansible_timeout' from source: unknown 25039 1726867445.22135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.22202: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867445.22211: variable 'omit' from source: magic vars 25039 1726867445.22217: starting attempt loop 25039 1726867445.22221: running the handler 25039 1726867445.22230: handler run complete 25039 1726867445.22238: attempt loop complete, returning result 25039 1726867445.22240: _execute() done 25039 1726867445.22242: dumping result to json 25039 1726867445.22247: done dumping result, returning 25039 1726867445.22257: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0affcac9-a3a5-3ddc-7272-0000000000cd] 25039 1726867445.22259: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000cd 25039 1726867445.22325: done sending task result for task 
0affcac9-a3a5-3ddc-7272-0000000000cd 25039 1726867445.22328: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 25039 1726867445.22409: no more pending results, returning what we have 25039 1726867445.22412: results queue empty 25039 1726867445.22413: checking for any_errors_fatal 25039 1726867445.22419: done checking for any_errors_fatal 25039 1726867445.22419: checking for max_fail_percentage 25039 1726867445.22421: done checking for max_fail_percentage 25039 1726867445.22421: checking to see if all hosts have failed and the running result is not ok 25039 1726867445.22422: done checking to see if all hosts have failed 25039 1726867445.22423: getting the remaining hosts for this loop 25039 1726867445.22424: done getting the remaining hosts for this loop 25039 1726867445.22427: getting the next task for host managed_node1 25039 1726867445.22434: done getting next task for host managed_node1 25039 1726867445.22437: ^ task is: TASK: Fix CentOS6 Base repo 25039 1726867445.22440: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867445.22444: getting variables 25039 1726867445.22445: in VariableManager get_vars() 25039 1726867445.22469: Calling all_inventory to load vars for managed_node1 25039 1726867445.22472: Calling groups_inventory to load vars for managed_node1 25039 1726867445.22474: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.22484: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.22486: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.22494: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.22612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.22727: done with get_vars() 25039 1726867445.22734: done getting variables 25039 1726867445.22822: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 17:24:05 -0400 (0:00:00.024) 0:00:02.754 ****** 25039 1726867445.22842: entering _queue_task() for managed_node1/copy 25039 1726867445.23040: worker is 1 (out of 1 available) 25039 1726867445.23052: exiting _queue_task() for managed_node1/copy 25039 1726867445.23062: done queuing things up, now waiting for results queue to drain 25039 1726867445.23063: waiting for pending results... 
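The ostree detection traced above (the `stat` result for `/run/ostree-booted` followed by the `set_fact` that records `__network_is_ostree`) would be produced by a task pair roughly like the one below. This is a hedged reconstruction from the log, not the actual contents of `el_repo_setup.yml`; only the task names, the stat path, the `__ostree_booted_stat` variable, and the `not __network_is_ostree is defined` condition appear verbatim in the output above.

```yaml
# Hypothetical reconstruction of the ostree check in el_repo_setup.yml,
# inferred from the debug log; the real file may differ in detail.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

On this run the file does not exist (`"exists": false`), so the flag comes out `false` and the ostree-specific branches later in the play are skipped.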
25039 1726867445.23210: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 25039 1726867445.23261: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000cf 25039 1726867445.23271: variable 'ansible_search_path' from source: unknown 25039 1726867445.23274: variable 'ansible_search_path' from source: unknown 25039 1726867445.23304: calling self._execute() 25039 1726867445.23357: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.23362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.23370: variable 'omit' from source: magic vars 25039 1726867445.23698: variable 'ansible_distribution' from source: facts 25039 1726867445.23714: Evaluated conditional (ansible_distribution == 'CentOS'): True 25039 1726867445.23855: variable 'ansible_distribution_major_version' from source: facts 25039 1726867445.23860: Evaluated conditional (ansible_distribution_major_version == '6'): False 25039 1726867445.23863: when evaluation is False, skipping this task 25039 1726867445.23865: _execute() done 25039 1726867445.23868: dumping result to json 25039 1726867445.23872: done dumping result, returning 25039 1726867445.23880: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0affcac9-a3a5-3ddc-7272-0000000000cf] 25039 1726867445.23884: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000cf 25039 1726867445.23967: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000cf 25039 1726867445.23970: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 25039 1726867445.24016: no more pending results, returning what we have 25039 1726867445.24020: results queue empty 25039 1726867445.24021: checking for any_errors_fatal 25039 1726867445.24024: done checking for any_errors_fatal 25039 1726867445.24025: checking for 
max_fail_percentage 25039 1726867445.24026: done checking for max_fail_percentage 25039 1726867445.24027: checking to see if all hosts have failed and the running result is not ok 25039 1726867445.24028: done checking to see if all hosts have failed 25039 1726867445.24029: getting the remaining hosts for this loop 25039 1726867445.24030: done getting the remaining hosts for this loop 25039 1726867445.24033: getting the next task for host managed_node1 25039 1726867445.24039: done getting next task for host managed_node1 25039 1726867445.24041: ^ task is: TASK: Include the task 'enable_epel.yml' 25039 1726867445.24044: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867445.24047: getting variables 25039 1726867445.24048: in VariableManager get_vars() 25039 1726867445.24072: Calling all_inventory to load vars for managed_node1 25039 1726867445.24075: Calling groups_inventory to load vars for managed_node1 25039 1726867445.24087: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.24096: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.24099: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.24101: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.24246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.24361: done with get_vars() 25039 1726867445.24367: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 17:24:05 -0400 (0:00:00.015) 0:00:02.769 ****** 25039 1726867445.24433: entering _queue_task() for managed_node1/include_tasks 25039 1726867445.24620: worker is 1 (out of 1 available) 25039 1726867445.24633: exiting _queue_task() for managed_node1/include_tasks 25039 1726867445.24644: done queuing things up, now waiting for results queue to drain 25039 1726867445.24645: waiting for pending results... 
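The "Fix CentOS6 Base repo" skip above shows two conditionals being evaluated in sequence: `ansible_distribution == 'CentOS'` is True, but `ansible_distribution_major_version == '6'` is False, so the task is skipped. A task shaped like the following would produce that trace; the `copy` payload is not visible in this excerpt, so its fields below are purely illustrative placeholders.

```yaml
# Hypothetical sketch; only the task name and the two conditions are
# taken from the log. The copy module's arguments are not shown in the
# excerpt, so dest/content here are placeholders.
- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # hypothetical destination
    content: ""                              # real content not in the log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

With a multi-item `when`, Ansible evaluates the conditions in order and reports the first one that fails as the `false_condition`, which matches the skip result printed above.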
25039 1726867445.24790: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 25039 1726867445.24850: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000d0 25039 1726867445.24860: variable 'ansible_search_path' from source: unknown 25039 1726867445.24864: variable 'ansible_search_path' from source: unknown 25039 1726867445.24897: calling self._execute() 25039 1726867445.24951: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.24955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.24964: variable 'omit' from source: magic vars 25039 1726867445.25329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867445.26801: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867445.26851: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867445.26880: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867445.26905: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867445.26937: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867445.26998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867445.27020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867445.27037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867445.27069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867445.27082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867445.27165: variable '__network_is_ostree' from source: set_fact 25039 1726867445.27187: Evaluated conditional (not __network_is_ostree | d(false)): True 25039 1726867445.27192: _execute() done 25039 1726867445.27195: dumping result to json 25039 1726867445.27198: done dumping result, returning 25039 1726867445.27204: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0affcac9-a3a5-3ddc-7272-0000000000d0] 25039 1726867445.27209: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000d0 25039 1726867445.27295: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000d0 25039 1726867445.27298: WORKER PROCESS EXITING 25039 1726867445.27323: no more pending results, returning what we have 25039 1726867445.27328: in VariableManager get_vars() 25039 1726867445.27358: Calling all_inventory to load vars for managed_node1 25039 1726867445.27361: Calling groups_inventory to load vars for managed_node1 25039 1726867445.27364: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.27375: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.27379: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.27382: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.27546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 25039 1726867445.27663: done with get_vars() 25039 1726867445.27669: variable 'ansible_search_path' from source: unknown 25039 1726867445.27670: variable 'ansible_search_path' from source: unknown 25039 1726867445.27712: we have included files to process 25039 1726867445.27713: generating all_blocks data 25039 1726867445.27714: done generating all_blocks data 25039 1726867445.27719: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 25039 1726867445.27720: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 25039 1726867445.27722: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 25039 1726867445.28192: done processing included file 25039 1726867445.28194: iterating over new_blocks loaded from include file 25039 1726867445.28195: in VariableManager get_vars() 25039 1726867445.28203: done with get_vars() 25039 1726867445.28204: filtering new block on tags 25039 1726867445.28220: done filtering new block on tags 25039 1726867445.28222: in VariableManager get_vars() 25039 1726867445.28228: done with get_vars() 25039 1726867445.28230: filtering new block on tags 25039 1726867445.28236: done filtering new block on tags 25039 1726867445.28237: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 25039 1726867445.28242: extending task lists for all hosts with included blocks 25039 1726867445.28303: done extending task lists 25039 1726867445.28304: done processing included files 25039 1726867445.28305: results queue empty 25039 1726867445.28305: checking for any_errors_fatal 25039 1726867445.28309: done checking for any_errors_fatal 25039 1726867445.28310: checking for max_fail_percentage 25039 1726867445.28310: done 
checking for max_fail_percentage 25039 1726867445.28311: checking to see if all hosts have failed and the running result is not ok 25039 1726867445.28311: done checking to see if all hosts have failed 25039 1726867445.28312: getting the remaining hosts for this loop 25039 1726867445.28313: done getting the remaining hosts for this loop 25039 1726867445.28314: getting the next task for host managed_node1 25039 1726867445.28317: done getting next task for host managed_node1 25039 1726867445.28319: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 25039 1726867445.28320: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867445.28322: getting variables 25039 1726867445.28322: in VariableManager get_vars() 25039 1726867445.28328: Calling all_inventory to load vars for managed_node1 25039 1726867445.28330: Calling groups_inventory to load vars for managed_node1 25039 1726867445.28331: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.28335: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.28341: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.28343: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.28438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.28549: done with get_vars() 25039 1726867445.28555: done getting variables 25039 1726867445.28606: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 25039 1726867445.28750: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 17:24:05 -0400 (0:00:00.043) 0:00:02.813 ****** 25039 1726867445.28784: entering _queue_task() for managed_node1/command 25039 1726867445.28786: Creating lock for command 25039 1726867445.29027: worker is 1 (out of 1 available) 25039 1726867445.29040: exiting _queue_task() for managed_node1/command 25039 1726867445.29051: done queuing things up, now waiting for results queue to drain 25039 1726867445.29053: waiting for pending results... 
25039 1726867445.29201: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 25039 1726867445.29263: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000ea 25039 1726867445.29274: variable 'ansible_search_path' from source: unknown 25039 1726867445.29291: variable 'ansible_search_path' from source: unknown 25039 1726867445.29312: calling self._execute() 25039 1726867445.29362: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.29367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.29375: variable 'omit' from source: magic vars 25039 1726867445.29650: variable 'ansible_distribution' from source: facts 25039 1726867445.29660: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25039 1726867445.29750: variable 'ansible_distribution_major_version' from source: facts 25039 1726867445.29753: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25039 1726867445.29757: when evaluation is False, skipping this task 25039 1726867445.29760: _execute() done 25039 1726867445.29765: dumping result to json 25039 1726867445.29768: done dumping result, returning 25039 1726867445.29774: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [0affcac9-a3a5-3ddc-7272-0000000000ea] 25039 1726867445.29780: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000ea 25039 1726867445.29872: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000ea 25039 1726867445.29875: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25039 1726867445.29932: no more pending results, returning what we have 25039 1726867445.29935: results queue empty 25039 1726867445.29936: checking for any_errors_fatal 25039 1726867445.29938: done checking for any_errors_fatal 25039 1726867445.29938: checking for 
max_fail_percentage 25039 1726867445.29940: done checking for max_fail_percentage 25039 1726867445.29940: checking to see if all hosts have failed and the running result is not ok 25039 1726867445.29941: done checking to see if all hosts have failed 25039 1726867445.29942: getting the remaining hosts for this loop 25039 1726867445.29943: done getting the remaining hosts for this loop 25039 1726867445.29946: getting the next task for host managed_node1 25039 1726867445.29953: done getting next task for host managed_node1 25039 1726867445.29955: ^ task is: TASK: Install yum-utils package 25039 1726867445.29959: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867445.29962: getting variables 25039 1726867445.29963: in VariableManager get_vars() 25039 1726867445.29995: Calling all_inventory to load vars for managed_node1 25039 1726867445.29997: Calling groups_inventory to load vars for managed_node1 25039 1726867445.30000: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.30012: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.30014: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.30017: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.30145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.30284: done with get_vars() 25039 1726867445.30292: done getting variables 25039 1726867445.30363: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 17:24:05 -0400 (0:00:00.015) 0:00:02.829 ****** 25039 1726867445.30384: entering _queue_task() for managed_node1/package 25039 1726867445.30386: Creating lock for package 25039 1726867445.30591: worker is 1 (out of 1 available) 25039 1726867445.30605: exiting _queue_task() for managed_node1/package 25039 1726867445.30615: done queuing things up, now waiting for results queue to drain 25039 1726867445.30617: waiting for pending results... 
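The run of skipped tasks in this log all fail the same conditional: `ansible_distribution in ['RedHat', 'CentOS']` evaluates True, but `ansible_distribution_major_version in ['7', '8']` evaluates False (the node reports major version 10, per the rendered task name "Create EPEL 10"). As a hedged sketch reconstructed only from the conditionals and action plugins reported above (not the actual contents of `tests/network/tasks/enable_epel.yml`), the pattern producing these `skipping: [managed_node1]` results looks like:

```yaml
# Hedged reconstruction from the log, not the real enable_epel.yml.
# The log shows the 'command' action plugin and these two conditionals;
# the actual command being run is not visible in this excerpt, so the
# placeholder below is hypothetical.
- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command: /bin/true   # placeholder; real command unknown
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

When a `when` list item evaluates False, the task executor short-circuits before running the module, which is why the log shows "when evaluation is False, skipping this task" and a result of `"skip_reason": "Conditional result was False"` with `"changed": false`.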
25039 1726867445.30766: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 25039 1726867445.30833: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000eb 25039 1726867445.30847: variable 'ansible_search_path' from source: unknown 25039 1726867445.30850: variable 'ansible_search_path' from source: unknown 25039 1726867445.30879: calling self._execute() 25039 1726867445.30933: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.30937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.30950: variable 'omit' from source: magic vars 25039 1726867445.31217: variable 'ansible_distribution' from source: facts 25039 1726867445.31226: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25039 1726867445.31318: variable 'ansible_distribution_major_version' from source: facts 25039 1726867445.31322: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25039 1726867445.31325: when evaluation is False, skipping this task 25039 1726867445.31328: _execute() done 25039 1726867445.31331: dumping result to json 25039 1726867445.31333: done dumping result, returning 25039 1726867445.31340: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0affcac9-a3a5-3ddc-7272-0000000000eb] 25039 1726867445.31345: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000eb 25039 1726867445.31426: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000eb 25039 1726867445.31430: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25039 1726867445.31473: no more pending results, returning what we have 25039 1726867445.31479: results queue empty 25039 1726867445.31480: checking for any_errors_fatal 25039 1726867445.31487: done checking for any_errors_fatal 25039 
1726867445.31488: checking for max_fail_percentage 25039 1726867445.31489: done checking for max_fail_percentage 25039 1726867445.31490: checking to see if all hosts have failed and the running result is not ok 25039 1726867445.31491: done checking to see if all hosts have failed 25039 1726867445.31491: getting the remaining hosts for this loop 25039 1726867445.31493: done getting the remaining hosts for this loop 25039 1726867445.31495: getting the next task for host managed_node1 25039 1726867445.31501: done getting next task for host managed_node1 25039 1726867445.31503: ^ task is: TASK: Enable EPEL 7 25039 1726867445.31506: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867445.31509: getting variables 25039 1726867445.31510: in VariableManager get_vars() 25039 1726867445.31532: Calling all_inventory to load vars for managed_node1 25039 1726867445.31535: Calling groups_inventory to load vars for managed_node1 25039 1726867445.31537: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.31548: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.31550: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.31553: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.31670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.31788: done with get_vars() 25039 1726867445.31795: done getting variables 25039 1726867445.31839: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 17:24:05 -0400 (0:00:00.014) 0:00:02.844 ****** 25039 1726867445.31859: entering _queue_task() for managed_node1/command 25039 1726867445.32044: worker is 1 (out of 1 available) 25039 1726867445.32057: exiting _queue_task() for managed_node1/command 25039 1726867445.32068: done queuing things up, now waiting for results queue to drain 25039 1726867445.32069: waiting for pending results... 
25039 1726867445.32214: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 25039 1726867445.32272: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000ec 25039 1726867445.32284: variable 'ansible_search_path' from source: unknown 25039 1726867445.32289: variable 'ansible_search_path' from source: unknown 25039 1726867445.32317: calling self._execute() 25039 1726867445.32368: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.32372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.32380: variable 'omit' from source: magic vars 25039 1726867445.32640: variable 'ansible_distribution' from source: facts 25039 1726867445.32650: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25039 1726867445.32735: variable 'ansible_distribution_major_version' from source: facts 25039 1726867445.32747: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25039 1726867445.32751: when evaluation is False, skipping this task 25039 1726867445.32753: _execute() done 25039 1726867445.32756: dumping result to json 25039 1726867445.32758: done dumping result, returning 25039 1726867445.32765: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0affcac9-a3a5-3ddc-7272-0000000000ec] 25039 1726867445.32769: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000ec 25039 1726867445.32853: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000ec 25039 1726867445.32856: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25039 1726867445.32902: no more pending results, returning what we have 25039 1726867445.32905: results queue empty 25039 1726867445.32906: checking for any_errors_fatal 25039 1726867445.32913: done checking for any_errors_fatal 25039 1726867445.32914: checking for 
max_fail_percentage 25039 1726867445.32915: done checking for max_fail_percentage 25039 1726867445.32916: checking to see if all hosts have failed and the running result is not ok 25039 1726867445.32917: done checking to see if all hosts have failed 25039 1726867445.32917: getting the remaining hosts for this loop 25039 1726867445.32918: done getting the remaining hosts for this loop 25039 1726867445.32921: getting the next task for host managed_node1 25039 1726867445.32926: done getting next task for host managed_node1 25039 1726867445.32928: ^ task is: TASK: Enable EPEL 8 25039 1726867445.32931: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867445.32934: getting variables 25039 1726867445.32936: in VariableManager get_vars() 25039 1726867445.32957: Calling all_inventory to load vars for managed_node1 25039 1726867445.32959: Calling groups_inventory to load vars for managed_node1 25039 1726867445.32962: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.32970: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.32972: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.32975: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.33111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.33225: done with get_vars() 25039 1726867445.33231: done getting variables 25039 1726867445.33270: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 17:24:05 -0400 (0:00:00.014) 0:00:02.858 ****** 25039 1726867445.33291: entering _queue_task() for managed_node1/command 25039 1726867445.33466: worker is 1 (out of 1 available) 25039 1726867445.33478: exiting _queue_task() for managed_node1/command 25039 1726867445.33490: done queuing things up, now waiting for results queue to drain 25039 1726867445.33491: waiting for pending results... 
25039 1726867445.33631: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 25039 1726867445.33755: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000ed 25039 1726867445.33759: variable 'ansible_search_path' from source: unknown 25039 1726867445.33763: variable 'ansible_search_path' from source: unknown 25039 1726867445.33766: calling self._execute() 25039 1726867445.33807: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.33813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.33820: variable 'omit' from source: magic vars 25039 1726867445.34069: variable 'ansible_distribution' from source: facts 25039 1726867445.34082: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25039 1726867445.34160: variable 'ansible_distribution_major_version' from source: facts 25039 1726867445.34163: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25039 1726867445.34166: when evaluation is False, skipping this task 25039 1726867445.34174: _execute() done 25039 1726867445.34179: dumping result to json 25039 1726867445.34181: done dumping result, returning 25039 1726867445.34185: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0affcac9-a3a5-3ddc-7272-0000000000ed] 25039 1726867445.34195: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000ed 25039 1726867445.34340: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000ed 25039 1726867445.34343: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25039 1726867445.34371: no more pending results, returning what we have 25039 1726867445.34373: results queue empty 25039 1726867445.34374: checking for any_errors_fatal 25039 1726867445.34379: done checking for any_errors_fatal 25039 1726867445.34380: checking for 
max_fail_percentage 25039 1726867445.34381: done checking for max_fail_percentage 25039 1726867445.34382: checking to see if all hosts have failed and the running result is not ok 25039 1726867445.34382: done checking to see if all hosts have failed 25039 1726867445.34383: getting the remaining hosts for this loop 25039 1726867445.34384: done getting the remaining hosts for this loop 25039 1726867445.34385: getting the next task for host managed_node1 25039 1726867445.34392: done getting next task for host managed_node1 25039 1726867445.34393: ^ task is: TASK: Enable EPEL 6 25039 1726867445.34396: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867445.34398: getting variables 25039 1726867445.34399: in VariableManager get_vars() 25039 1726867445.34416: Calling all_inventory to load vars for managed_node1 25039 1726867445.34418: Calling groups_inventory to load vars for managed_node1 25039 1726867445.34420: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.34426: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.34428: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.34429: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.34534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.34647: done with get_vars() 25039 1726867445.34654: done getting variables 25039 1726867445.34696: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 17:24:05 -0400 (0:00:00.014) 0:00:02.872 ****** 25039 1726867445.34717: entering _queue_task() for managed_node1/copy 25039 1726867445.34892: worker is 1 (out of 1 available) 25039 1726867445.34905: exiting _queue_task() for managed_node1/copy 25039 1726867445.34915: done queuing things up, now waiting for results queue to drain 25039 1726867445.34917: waiting for pending results... 
25039 1726867445.35066: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 25039 1726867445.35130: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000ef 25039 1726867445.35143: variable 'ansible_search_path' from source: unknown 25039 1726867445.35146: variable 'ansible_search_path' from source: unknown 25039 1726867445.35172: calling self._execute() 25039 1726867445.35224: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.35229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.35237: variable 'omit' from source: magic vars 25039 1726867445.35705: variable 'ansible_distribution' from source: facts 25039 1726867445.35711: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25039 1726867445.35714: variable 'ansible_distribution_major_version' from source: facts 25039 1726867445.35716: Evaluated conditional (ansible_distribution_major_version == '6'): False 25039 1726867445.35718: when evaluation is False, skipping this task 25039 1726867445.35720: _execute() done 25039 1726867445.35722: dumping result to json 25039 1726867445.35724: done dumping result, returning 25039 1726867445.35733: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0affcac9-a3a5-3ddc-7272-0000000000ef] 25039 1726867445.35742: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000ef 25039 1726867445.36245: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000ef 25039 1726867445.36249: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 25039 1726867445.36289: no more pending results, returning what we have 25039 1726867445.36292: results queue empty 25039 1726867445.36293: checking for any_errors_fatal 25039 1726867445.36296: done checking for any_errors_fatal 25039 1726867445.36297: checking for max_fail_percentage 
25039 1726867445.36298: done checking for max_fail_percentage 25039 1726867445.36299: checking to see if all hosts have failed and the running result is not ok 25039 1726867445.36300: done checking to see if all hosts have failed 25039 1726867445.36300: getting the remaining hosts for this loop 25039 1726867445.36302: done getting the remaining hosts for this loop 25039 1726867445.36305: getting the next task for host managed_node1 25039 1726867445.36311: done getting next task for host managed_node1 25039 1726867445.36314: ^ task is: TASK: Set network provider to 'nm' 25039 1726867445.36316: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867445.36387: getting variables 25039 1726867445.36389: in VariableManager get_vars() 25039 1726867445.36414: Calling all_inventory to load vars for managed_node1 25039 1726867445.36417: Calling groups_inventory to load vars for managed_node1 25039 1726867445.36420: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.36486: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.36490: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.36494: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.36913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.37172: done with get_vars() 25039 1726867445.37185: done getting variables 25039 1726867445.37259: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:13 Friday 20 September 2024 17:24:05 -0400 (0:00:00.025) 0:00:02.898 ****** 25039 1726867445.37291: entering _queue_task() for managed_node1/set_fact 25039 1726867445.37579: worker is 1 (out of 1 available) 25039 1726867445.37593: exiting _queue_task() for managed_node1/set_fact 25039 1726867445.37605: done queuing things up, now waiting for results queue to drain 25039 1726867445.37607: waiting for pending results... 25039 1726867445.37841: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 25039 1726867445.38215: in run() - task 0affcac9-a3a5-3ddc-7272-000000000007 25039 1726867445.38220: variable 'ansible_search_path' from source: unknown 25039 1726867445.38223: calling self._execute() 25039 1726867445.38244: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.38258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.38274: variable 'omit' from source: magic vars 25039 1726867445.38396: variable 'omit' from source: magic vars 25039 1726867445.38439: variable 'omit' from source: magic vars 25039 1726867445.38487: variable 'omit' from source: magic vars 25039 1726867445.38545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867445.38591: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867445.38627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867445.38652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867445.38670: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867445.38715: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867445.38728: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.38741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.38888: Set connection var ansible_shell_executable to /bin/sh 25039 1726867445.38931: Set connection var ansible_timeout to 10 25039 1726867445.38934: Set connection var ansible_shell_type to sh 25039 1726867445.38936: Set connection var ansible_connection to ssh 25039 1726867445.38939: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867445.39061: Set connection var ansible_pipelining to False 25039 1726867445.39065: variable 'ansible_shell_executable' from source: unknown 25039 1726867445.39067: variable 'ansible_connection' from source: unknown 25039 1726867445.39069: variable 'ansible_module_compression' from source: unknown 25039 1726867445.39071: variable 'ansible_shell_type' from source: unknown 25039 1726867445.39073: variable 'ansible_shell_executable' from source: unknown 25039 1726867445.39076: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.39079: variable 'ansible_pipelining' from source: unknown 25039 1726867445.39081: variable 'ansible_timeout' from source: unknown 25039 1726867445.39083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.39237: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867445.39292: variable 'omit' from source: magic vars 25039 1726867445.39303: starting 
attempt loop 25039 1726867445.39315: running the handler 25039 1726867445.39339: handler run complete 25039 1726867445.39356: attempt loop complete, returning result 25039 1726867445.39364: _execute() done 25039 1726867445.39371: dumping result to json 25039 1726867445.39384: done dumping result, returning 25039 1726867445.39396: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0affcac9-a3a5-3ddc-7272-000000000007] 25039 1726867445.39404: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000007 25039 1726867445.39696: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000007 25039 1726867445.39699: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 25039 1726867445.39746: no more pending results, returning what we have 25039 1726867445.39748: results queue empty 25039 1726867445.39749: checking for any_errors_fatal 25039 1726867445.39754: done checking for any_errors_fatal 25039 1726867445.39754: checking for max_fail_percentage 25039 1726867445.39756: done checking for max_fail_percentage 25039 1726867445.39756: checking to see if all hosts have failed and the running result is not ok 25039 1726867445.39757: done checking to see if all hosts have failed 25039 1726867445.39758: getting the remaining hosts for this loop 25039 1726867445.39759: done getting the remaining hosts for this loop 25039 1726867445.39762: getting the next task for host managed_node1 25039 1726867445.39767: done getting next task for host managed_node1 25039 1726867445.39769: ^ task is: TASK: meta (flush_handlers) 25039 1726867445.39770: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867445.39774: getting variables 25039 1726867445.39776: in VariableManager get_vars() 25039 1726867445.39806: Calling all_inventory to load vars for managed_node1 25039 1726867445.39812: Calling groups_inventory to load vars for managed_node1 25039 1726867445.39816: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.39827: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.39830: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.39833: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.40009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.40295: done with get_vars() 25039 1726867445.40306: done getting variables 25039 1726867445.40370: in VariableManager get_vars() 25039 1726867445.40740: Calling all_inventory to load vars for managed_node1 25039 1726867445.40743: Calling groups_inventory to load vars for managed_node1 25039 1726867445.40746: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.40751: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.40753: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.40756: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.41100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.41930: done with get_vars() 25039 1726867445.41947: done queuing things up, now waiting for results queue to drain 25039 1726867445.41949: results queue empty 25039 1726867445.41950: checking for any_errors_fatal 25039 1726867445.41952: done checking for any_errors_fatal 25039 1726867445.41953: checking for max_fail_percentage 25039 1726867445.41954: done checking for max_fail_percentage 25039 1726867445.41954: checking to see if all hosts have failed and the running result is not 
ok 25039 1726867445.41955: done checking to see if all hosts have failed 25039 1726867445.41956: getting the remaining hosts for this loop 25039 1726867445.41957: done getting the remaining hosts for this loop 25039 1726867445.41960: getting the next task for host managed_node1 25039 1726867445.41964: done getting next task for host managed_node1 25039 1726867445.41965: ^ task is: TASK: meta (flush_handlers) 25039 1726867445.41966: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867445.41975: getting variables 25039 1726867445.41976: in VariableManager get_vars() 25039 1726867445.42190: Calling all_inventory to load vars for managed_node1 25039 1726867445.42193: Calling groups_inventory to load vars for managed_node1 25039 1726867445.42196: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.42203: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.42206: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.42211: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.42353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.42744: done with get_vars() 25039 1726867445.42754: done getting variables 25039 1726867445.42880: in VariableManager get_vars() 25039 1726867445.42890: Calling all_inventory to load vars for managed_node1 25039 1726867445.42893: Calling groups_inventory to load vars for managed_node1 25039 1726867445.42895: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.42900: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.42902: Calling groups_plugins_inventory to load vars for 
managed_node1 25039 1726867445.42905: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.43043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.43271: done with get_vars() 25039 1726867445.43288: done queuing things up, now waiting for results queue to drain 25039 1726867445.43290: results queue empty 25039 1726867445.43291: checking for any_errors_fatal 25039 1726867445.43292: done checking for any_errors_fatal 25039 1726867445.43293: checking for max_fail_percentage 25039 1726867445.43294: done checking for max_fail_percentage 25039 1726867445.43294: checking to see if all hosts have failed and the running result is not ok 25039 1726867445.43295: done checking to see if all hosts have failed 25039 1726867445.43296: getting the remaining hosts for this loop 25039 1726867445.43297: done getting the remaining hosts for this loop 25039 1726867445.43299: getting the next task for host managed_node1 25039 1726867445.43302: done getting next task for host managed_node1 25039 1726867445.43303: ^ task is: None 25039 1726867445.43304: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867445.43305: done queuing things up, now waiting for results queue to drain 25039 1726867445.43306: results queue empty 25039 1726867445.43309: checking for any_errors_fatal 25039 1726867445.43310: done checking for any_errors_fatal 25039 1726867445.43311: checking for max_fail_percentage 25039 1726867445.43312: done checking for max_fail_percentage 25039 1726867445.43312: checking to see if all hosts have failed and the running result is not ok 25039 1726867445.43313: done checking to see if all hosts have failed 25039 1726867445.43315: getting the next task for host managed_node1 25039 1726867445.43317: done getting next task for host managed_node1 25039 1726867445.43318: ^ task is: None 25039 1726867445.43319: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867445.43370: in VariableManager get_vars() 25039 1726867445.43398: done with get_vars() 25039 1726867445.43406: in VariableManager get_vars() 25039 1726867445.43424: done with get_vars() 25039 1726867445.43429: variable 'omit' from source: magic vars 25039 1726867445.43459: in VariableManager get_vars() 25039 1726867445.43473: done with get_vars() 25039 1726867445.43498: variable 'omit' from source: magic vars PLAY [Play for testing IPv6 config] ******************************************** 25039 1726867445.43875: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 25039 1726867445.43902: getting the remaining hosts for this loop 25039 1726867445.43903: done getting the remaining hosts for this loop 25039 1726867445.43905: getting the next task for host managed_node1 25039 1726867445.43910: done getting next task for host managed_node1 25039 1726867445.43912: ^ task is: TASK: Gathering Facts 25039 1726867445.43913: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867445.43915: getting variables 25039 1726867445.43916: in VariableManager get_vars() 25039 1726867445.43927: Calling all_inventory to load vars for managed_node1 25039 1726867445.43929: Calling groups_inventory to load vars for managed_node1 25039 1726867445.43930: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867445.43935: Calling all_plugins_play to load vars for managed_node1 25039 1726867445.43946: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867445.43949: Calling groups_plugins_play to load vars for managed_node1 25039 1726867445.44066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867445.44255: done with get_vars() 25039 1726867445.44264: done getting variables 25039 1726867445.44311: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3 Friday 20 September 2024 17:24:05 -0400 (0:00:00.070) 0:00:02.968 ****** 25039 1726867445.44335: entering _queue_task() for managed_node1/gather_facts 25039 1726867445.44900: worker is 1 (out of 1 available) 25039 1726867445.44910: exiting _queue_task() for managed_node1/gather_facts 25039 1726867445.44920: done queuing things up, now waiting for results queue to drain 25039 1726867445.44922: waiting for pending results... 
25039 1726867445.45016: running TaskExecutor() for managed_node1/TASK: Gathering Facts 25039 1726867445.45147: in run() - task 0affcac9-a3a5-3ddc-7272-000000000115 25039 1726867445.45152: variable 'ansible_search_path' from source: unknown 25039 1726867445.45240: calling self._execute() 25039 1726867445.45289: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.45364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.45368: variable 'omit' from source: magic vars 25039 1726867445.45801: variable 'ansible_distribution_major_version' from source: facts 25039 1726867445.45823: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867445.45834: variable 'omit' from source: magic vars 25039 1726867445.45869: variable 'omit' from source: magic vars 25039 1726867445.45911: variable 'omit' from source: magic vars 25039 1726867445.45954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867445.45998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867445.46026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867445.46147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867445.46184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867445.46201: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867445.46211: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.46292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.46326: Set connection var ansible_shell_executable to /bin/sh 25039 1726867445.46338: Set 
connection var ansible_timeout to 10 25039 1726867445.46348: Set connection var ansible_shell_type to sh 25039 1726867445.46354: Set connection var ansible_connection to ssh 25039 1726867445.46364: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867445.46372: Set connection var ansible_pipelining to False 25039 1726867445.46403: variable 'ansible_shell_executable' from source: unknown 25039 1726867445.46414: variable 'ansible_connection' from source: unknown 25039 1726867445.46421: variable 'ansible_module_compression' from source: unknown 25039 1726867445.46428: variable 'ansible_shell_type' from source: unknown 25039 1726867445.46435: variable 'ansible_shell_executable' from source: unknown 25039 1726867445.46441: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867445.46448: variable 'ansible_pipelining' from source: unknown 25039 1726867445.46456: variable 'ansible_timeout' from source: unknown 25039 1726867445.46463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867445.46646: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867445.46725: variable 'omit' from source: magic vars 25039 1726867445.46728: starting attempt loop 25039 1726867445.46731: running the handler 25039 1726867445.46734: variable 'ansible_facts' from source: unknown 25039 1726867445.46735: _low_level_execute_command(): starting 25039 1726867445.46737: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867445.47558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867445.47619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867445.47675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25039 1726867445.49963: stdout chunk (state=3): >>>/root <<< 25039 1726867445.50174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867445.50214: stdout chunk (state=3): >>><<< 25039 1726867445.50282: stderr chunk (state=3): >>><<< 25039 1726867445.50287: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25039 1726867445.50290: _low_level_execute_command(): starting 25039 1726867445.50292: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589 `" && echo ansible-tmp-1726867445.5024798-25216-134528972887589="` echo /root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589 `" ) && sleep 0' 25039 1726867445.51121: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867445.51135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867445.51149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867445.51164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867445.51184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867445.51198: stderr chunk (state=3): >>>debug2: match not found <<< 25039 1726867445.51217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867445.51295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867445.51336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867445.51352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867445.51375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867445.51468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25039 1726867445.54136: stdout chunk (state=3): >>>ansible-tmp-1726867445.5024798-25216-134528972887589=/root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589 <<< 25039 1726867445.54290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867445.54343: stderr chunk (state=3): >>><<< 25039 1726867445.54359: stdout chunk (state=3): >>><<< 25039 1726867445.54385: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867445.5024798-25216-134528972887589=/root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25039 1726867445.54445: variable 'ansible_module_compression' from source: unknown 25039 1726867445.54698: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25039 1726867445.54761: variable 'ansible_facts' from source: unknown 25039 1726867445.54971: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589/AnsiballZ_setup.py 25039 1726867445.55204: Sending initial data 25039 1726867445.55210: Sent initial data (154 bytes) 25039 1726867445.55714: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867445.55792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867445.55831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867445.55846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867445.55864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867445.55948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25039 1726867445.58218: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867445.58268: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867445.58354: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpzc9ppaz2 /root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589/AnsiballZ_setup.py <<< 25039 1726867445.58357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589/AnsiballZ_setup.py" <<< 25039 1726867445.58431: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpzc9ppaz2" to remote "/root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589/AnsiballZ_setup.py" <<< 25039 1726867445.59962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867445.60040: stderr chunk (state=3): >>><<< 25039 1726867445.60051: stdout chunk (state=3): >>><<< 25039 1726867445.60144: done transferring module to remote 25039 1726867445.60147: _low_level_execute_command(): starting 25039 1726867445.60151: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589/ /root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589/AnsiballZ_setup.py && sleep 0' 25039 1726867445.60752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867445.60765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867445.60785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867445.60818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867445.60920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867445.60943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867445.60961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867445.60986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867445.61073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25039 1726867445.63600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867445.63625: stderr chunk (state=3): >>><<< 25039 1726867445.63628: stdout chunk (state=3): >>><<< 25039 1726867445.63641: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25039 1726867445.63644: _low_level_execute_command(): starting 25039 1726867445.63650: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589/AnsiballZ_setup.py && sleep 0' 25039 1726867445.64074: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867445.64080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867445.64083: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 1726867445.64085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867445.64087: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867445.64135: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867445.64138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867445.64201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25039 1726867446.48291: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "a<<< 25039 1726867446.48499: stdout chunk (state=3): >>>nsible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2935, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 596, "free": 2935}, "nocache": {"free": 3274, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 691, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794168832, "block_size": 4096, "block_total": 65519099, "block_available": 63914592, "block_used": 1604507, "inode_total": 131070960, "inode_available": 131029045, "inode_used": 41915, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "24", "second": "06", "epoch": "1726867446", "epoch_int": "1726867446", "date": "2024-09-20", "time": "17:24:06", "iso8601_micro": "2024-09-20T21:24:06.415366Z", "iso8601": "2024-09-20T21:24:06Z", "iso8601_basic": "20240920T172406415366", "iso8601_basic_short": "20240920T172406", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": 
"CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:2<<< 25039 1726867446.48529: stdout chunk (state=3): >>>56M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_loadavg": {"1m": 0.78515625, "5m": 0.462890625, "15m": 0.2392578125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": 
"off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": 
"eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25039 1726867446.51097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867446.51174: stderr chunk (state=3): >>>Shared connection to 10.31.12.57 closed. <<< 25039 1726867446.51183: stdout chunk (state=3): >>><<< 25039 1726867446.51196: stderr chunk (state=3): >>><<< 25039 1726867446.51386: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2935, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 596, "free": 2935}, "nocache": {"free": 3274, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", 
"ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 691, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794168832, "block_size": 4096, "block_total": 65519099, "block_available": 63914592, "block_used": 1604507, "inode_total": 131070960, "inode_available": 131029045, "inode_used": 41915, "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "24", "second": "06", "epoch": "1726867446", "epoch_int": "1726867446", "date": "2024-09-20", "time": "17:24:06", "iso8601_micro": "2024-09-20T21:24:06.415366Z", "iso8601": "2024-09-20T21:24:06Z", "iso8601_basic": "20240920T172406415366", "iso8601_basic_short": "20240920T172406", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": 
"tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_loadavg": {"1m": 0.78515625, "5m": 0.462890625, "15m": 0.2392578125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": 
"on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": 
true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": 
"off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 25039 1726867446.51582: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867446.51614: _low_level_execute_command(): starting 25039 1726867446.51624: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867445.5024798-25216-134528972887589/ > /dev/null 2>&1 && sleep 0' 25039 1726867446.52219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867446.52235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867446.52247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867446.52290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867446.52360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867446.52376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867446.52397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867446.52483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25039 1726867446.55043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867446.55053: stdout chunk (state=3): >>><<< 25039 1726867446.55063: stderr chunk (state=3): >>><<< 25039 1726867446.55086: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25039 1726867446.55099: handler run complete 25039 1726867446.55234: variable 'ansible_facts' from source: unknown 25039 1726867446.55336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867446.55664: variable 'ansible_facts' from source: unknown 25039 1726867446.55772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867446.55918: attempt loop complete, returning result 25039 1726867446.55927: _execute() done 25039 1726867446.55933: dumping result to json 25039 1726867446.55966: done dumping result, returning 25039 1726867446.55979: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affcac9-a3a5-3ddc-7272-000000000115] 25039 1726867446.55990: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000115 25039 1726867446.56595: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000115 25039 1726867446.56598: WORKER PROCESS EXITING ok: [managed_node1] 25039 1726867446.56876: no more pending results, returning what we have 25039 1726867446.56922: results queue empty 25039 1726867446.56923: checking for any_errors_fatal 25039 1726867446.56925: done checking for any_errors_fatal 25039 1726867446.56925: checking for max_fail_percentage 25039 1726867446.56927: done checking for max_fail_percentage 25039 1726867446.56927: checking to see if all hosts have failed 
and the running result is not ok 25039 1726867446.56929: done checking to see if all hosts have failed 25039 1726867446.56930: getting the remaining hosts for this loop 25039 1726867446.56931: done getting the remaining hosts for this loop 25039 1726867446.56934: getting the next task for host managed_node1 25039 1726867446.56939: done getting next task for host managed_node1 25039 1726867446.56941: ^ task is: TASK: meta (flush_handlers) 25039 1726867446.56943: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867446.56946: getting variables 25039 1726867446.56948: in VariableManager get_vars() 25039 1726867446.56974: Calling all_inventory to load vars for managed_node1 25039 1726867446.56976: Calling groups_inventory to load vars for managed_node1 25039 1726867446.56980: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867446.56991: Calling all_plugins_play to load vars for managed_node1 25039 1726867446.56997: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867446.57001: Calling groups_plugins_play to load vars for managed_node1 25039 1726867446.57174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867446.57392: done with get_vars() 25039 1726867446.57403: done getting variables 25039 1726867446.57478: in VariableManager get_vars() 25039 1726867446.57492: Calling all_inventory to load vars for managed_node1 25039 1726867446.57494: Calling groups_inventory to load vars for managed_node1 25039 1726867446.57496: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867446.57500: Calling all_plugins_play to load vars for managed_node1 25039 1726867446.57503: Calling 
groups_plugins_inventory to load vars for managed_node1 25039 1726867446.57505: Calling groups_plugins_play to load vars for managed_node1 25039 1726867446.57651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867446.57841: done with get_vars() 25039 1726867446.57852: done queuing things up, now waiting for results queue to drain 25039 1726867446.57854: results queue empty 25039 1726867446.57855: checking for any_errors_fatal 25039 1726867446.57857: done checking for any_errors_fatal 25039 1726867446.57858: checking for max_fail_percentage 25039 1726867446.57859: done checking for max_fail_percentage 25039 1726867446.57867: checking to see if all hosts have failed and the running result is not ok 25039 1726867446.57868: done checking to see if all hosts have failed 25039 1726867446.57869: getting the remaining hosts for this loop 25039 1726867446.57870: done getting the remaining hosts for this loop 25039 1726867446.57872: getting the next task for host managed_node1 25039 1726867446.57875: done getting next task for host managed_node1 25039 1726867446.57880: ^ task is: TASK: Include the task 'show_interfaces.yml' 25039 1726867446.57881: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867446.57883: getting variables 25039 1726867446.57884: in VariableManager get_vars() 25039 1726867446.57900: Calling all_inventory to load vars for managed_node1 25039 1726867446.57902: Calling groups_inventory to load vars for managed_node1 25039 1726867446.57903: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867446.57907: Calling all_plugins_play to load vars for managed_node1 25039 1726867446.57909: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867446.57911: Calling groups_plugins_play to load vars for managed_node1 25039 1726867446.58051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867446.58345: done with get_vars() 25039 1726867446.58353: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:9 Friday 20 September 2024 17:24:06 -0400 (0:00:01.141) 0:00:04.110 ****** 25039 1726867446.58475: entering _queue_task() for managed_node1/include_tasks 25039 1726867446.58994: worker is 1 (out of 1 available) 25039 1726867446.59001: exiting _queue_task() for managed_node1/include_tasks 25039 1726867446.59010: done queuing things up, now waiting for results queue to drain 25039 1726867446.59012: waiting for pending results... 
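The numeric prefix on every trace line above is the controller PID (`25039`) followed by a Unix epoch timestamp, while the task banner carries a human-readable stamp (`Friday 20 September 2024 17:24:06 -0400`). The two agree, which is handy when correlating trace lines with banners. A minimal sketch of the conversion (the `-0400` offset is taken from the banner itself, not detected):

```python
from datetime import datetime, timezone, timedelta

def epoch_to_local(epoch: float, utc_offset_hours: int = -4) -> str:
    """Convert a trace-line epoch (e.g. 1726867446.58475) to local wall-clock time."""
    tz = timezone(timedelta(hours=utc_offset_hours))
    return datetime.fromtimestamp(epoch, tz).strftime("%Y-%m-%d %H:%M:%S")

print(epoch_to_local(1726867446.58475))  # 2024-09-20 17:24:06, matching the banner
```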
25039 1726867446.59137: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 25039 1726867446.59253: in run() - task 0affcac9-a3a5-3ddc-7272-00000000000b 25039 1726867446.59296: variable 'ansible_search_path' from source: unknown 25039 1726867446.59317: calling self._execute() 25039 1726867446.59436: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867446.59522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867446.59569: variable 'omit' from source: magic vars 25039 1726867446.59965: variable 'ansible_distribution_major_version' from source: facts 25039 1726867446.59987: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867446.60010: _execute() done 25039 1726867446.60020: dumping result to json 25039 1726867446.60027: done dumping result, returning 25039 1726867446.60083: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-3ddc-7272-00000000000b] 25039 1726867446.60087: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000000b 25039 1726867446.60246: no more pending results, returning what we have 25039 1726867446.60252: in VariableManager get_vars() 25039 1726867446.60302: Calling all_inventory to load vars for managed_node1 25039 1726867446.60305: Calling groups_inventory to load vars for managed_node1 25039 1726867446.60308: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867446.60385: Calling all_plugins_play to load vars for managed_node1 25039 1726867446.60389: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867446.60394: Calling groups_plugins_play to load vars for managed_node1 25039 1726867446.60695: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000000b 25039 1726867446.60699: WORKER PROCESS EXITING 25039 1726867446.60718: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867446.60923: done with get_vars() 25039 1726867446.60930: variable 'ansible_search_path' from source: unknown 25039 1726867446.60942: we have included files to process 25039 1726867446.60943: generating all_blocks data 25039 1726867446.60944: done generating all_blocks data 25039 1726867446.60945: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25039 1726867446.60946: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25039 1726867446.60948: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25039 1726867446.61111: in VariableManager get_vars() 25039 1726867446.61130: done with get_vars() 25039 1726867446.61247: done processing included file 25039 1726867446.61249: iterating over new_blocks loaded from include file 25039 1726867446.61251: in VariableManager get_vars() 25039 1726867446.61269: done with get_vars() 25039 1726867446.61271: filtering new block on tags 25039 1726867446.61290: done filtering new block on tags 25039 1726867446.61292: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 25039 1726867446.61301: extending task lists for all hosts with included blocks 25039 1726867446.61366: done extending task lists 25039 1726867446.61367: done processing included files 25039 1726867446.61368: results queue empty 25039 1726867446.61369: checking for any_errors_fatal 25039 1726867446.61371: done checking for any_errors_fatal 25039 1726867446.61371: checking for max_fail_percentage 25039 1726867446.61373: done checking for 
max_fail_percentage 25039 1726867446.61373: checking to see if all hosts have failed and the running result is not ok 25039 1726867446.61374: done checking to see if all hosts have failed 25039 1726867446.61375: getting the remaining hosts for this loop 25039 1726867446.61376: done getting the remaining hosts for this loop 25039 1726867446.61382: getting the next task for host managed_node1 25039 1726867446.61386: done getting next task for host managed_node1 25039 1726867446.61388: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 25039 1726867446.61391: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867446.61393: getting variables 25039 1726867446.61394: in VariableManager get_vars() 25039 1726867446.61421: Calling all_inventory to load vars for managed_node1 25039 1726867446.61424: Calling groups_inventory to load vars for managed_node1 25039 1726867446.61426: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867446.61431: Calling all_plugins_play to load vars for managed_node1 25039 1726867446.61433: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867446.61436: Calling groups_plugins_play to load vars for managed_node1 25039 1726867446.61612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867446.61876: done with get_vars() 25039 1726867446.61888: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:24:06 -0400 (0:00:00.034) 0:00:04.145 ****** 25039 1726867446.61960: entering _queue_task() for managed_node1/include_tasks 25039 1726867446.62302: worker is 1 (out of 1 available) 25039 1726867446.62311: exiting _queue_task() for managed_node1/include_tasks 25039 1726867446.62321: done queuing things up, now waiting for results queue to drain 25039 1726867446.62322: waiting for pending results... 
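Each worker trace line follows a fixed `PID epoch: message` shape, which makes the stream straightforward to post-process (for example, to grep out only the `TaskExecutor` entries). A small sketch of splitting such lines; the field names are my own, not Ansible's:

```python
import re

# PID, fractional epoch timestamp, then the free-form message.
TRACE_LINE = re.compile(r"^(?P<pid>\d+)\s+(?P<ts>\d+\.\d+):\s+(?P<msg>.*)$")

def parse_trace(line: str) -> dict:
    """Split one '-vvvv' worker trace line into pid, timestamp, and message."""
    m = TRACE_LINE.match(line)
    if not m:
        raise ValueError(f"not a trace line: {line!r}")
    return {"pid": int(m["pid"]), "ts": float(m["ts"]), "msg": m["msg"]}

rec = parse_trace("25039 1726867446.62497: running TaskExecutor() for managed_node1")
print(rec["pid"], rec["msg"])
```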
25039 1726867446.62497: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 25039 1726867446.62565: in run() - task 0affcac9-a3a5-3ddc-7272-00000000012b 25039 1726867446.62586: variable 'ansible_search_path' from source: unknown 25039 1726867446.62598: variable 'ansible_search_path' from source: unknown 25039 1726867446.62636: calling self._execute() 25039 1726867446.62722: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867446.62768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867446.62771: variable 'omit' from source: magic vars 25039 1726867446.63111: variable 'ansible_distribution_major_version' from source: facts 25039 1726867446.63128: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867446.63145: _execute() done 25039 1726867446.63154: dumping result to json 25039 1726867446.63182: done dumping result, returning 25039 1726867446.63186: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-3ddc-7272-00000000012b] 25039 1726867446.63188: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000012b 25039 1726867446.63381: no more pending results, returning what we have 25039 1726867446.63386: in VariableManager get_vars() 25039 1726867446.63544: Calling all_inventory to load vars for managed_node1 25039 1726867446.63547: Calling groups_inventory to load vars for managed_node1 25039 1726867446.63549: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867446.63559: Calling all_plugins_play to load vars for managed_node1 25039 1726867446.63562: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867446.63564: Calling groups_plugins_play to load vars for managed_node1 25039 1726867446.63767: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000012b 25039 1726867446.63771: WORKER PROCESS EXITING 25039 
1726867446.63814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867446.64125: done with get_vars() 25039 1726867446.64133: variable 'ansible_search_path' from source: unknown 25039 1726867446.64134: variable 'ansible_search_path' from source: unknown 25039 1726867446.64170: we have included files to process 25039 1726867446.64172: generating all_blocks data 25039 1726867446.64173: done generating all_blocks data 25039 1726867446.64174: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25039 1726867446.64175: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25039 1726867446.64179: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25039 1726867446.64928: done processing included file 25039 1726867446.64930: iterating over new_blocks loaded from include file 25039 1726867446.64932: in VariableManager get_vars() 25039 1726867446.64954: done with get_vars() 25039 1726867446.64956: filtering new block on tags 25039 1726867446.64973: done filtering new block on tags 25039 1726867446.64975: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 25039 1726867446.64985: extending task lists for all hosts with included blocks 25039 1726867446.65094: done extending task lists 25039 1726867446.65096: done processing included files 25039 1726867446.65097: results queue empty 25039 1726867446.65097: checking for any_errors_fatal 25039 1726867446.65100: done checking for any_errors_fatal 25039 1726867446.65101: checking for max_fail_percentage 25039 1726867446.65102: done 
checking for max_fail_percentage 25039 1726867446.65102: checking to see if all hosts have failed and the running result is not ok 25039 1726867446.65103: done checking to see if all hosts have failed 25039 1726867446.65104: getting the remaining hosts for this loop 25039 1726867446.65105: done getting the remaining hosts for this loop 25039 1726867446.65108: getting the next task for host managed_node1 25039 1726867446.65112: done getting next task for host managed_node1 25039 1726867446.65114: ^ task is: TASK: Gather current interface info 25039 1726867446.65117: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867446.65119: getting variables 25039 1726867446.65120: in VariableManager get_vars() 25039 1726867446.65132: Calling all_inventory to load vars for managed_node1 25039 1726867446.65134: Calling groups_inventory to load vars for managed_node1 25039 1726867446.65137: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867446.65142: Calling all_plugins_play to load vars for managed_node1 25039 1726867446.65144: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867446.65147: Calling groups_plugins_play to load vars for managed_node1 25039 1726867446.65248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867446.65370: done with get_vars() 25039 1726867446.65376: done getting variables 25039 1726867446.65404: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:24:06 -0400 (0:00:00.034) 0:00:04.179 ****** 25039 1726867446.65425: entering _queue_task() for managed_node1/command 25039 1726867446.65605: worker is 1 (out of 1 available) 25039 1726867446.65618: exiting _queue_task() for managed_node1/command 25039 1726867446.65629: done queuing things up, now waiting for results queue to drain 25039 1726867446.65631: waiting for pending results... 
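The task banners above each carry two durations, `(per-task) cumulative`, e.g. `(0:00:00.034) 0:00:04.179`. When profiling a slow run it is the parenthesized per-task figure that matters. A hedged sketch for extracting it as seconds (the `H:MM:SS.fff` layout is assumed from the banners in this log):

```python
import re

# Parenthesized duration in the task banner: (H:MM:SS.fff)
DURATION = re.compile(r"\((\d+):(\d{2}):(\d{2}\.\d+)\)")

def task_seconds(banner: str) -> float:
    """Extract the parenthesized per-task duration from a task banner, in seconds."""
    m = DURATION.search(banner)
    if not m:
        raise ValueError("no per-task duration found in banner")
    hours, minutes, seconds = int(m[1]), int(m[2]), float(m[3])
    return hours * 3600 + minutes * 60 + seconds

print(task_seconds(
    "Friday 20 September 2024 17:24:06 -0400 (0:00:00.034) 0:00:04.179"
))
```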
25039 1726867446.65780: running TaskExecutor() for managed_node1/TASK: Gather current interface info 25039 1726867446.65839: in run() - task 0affcac9-a3a5-3ddc-7272-00000000013a 25039 1726867446.65852: variable 'ansible_search_path' from source: unknown 25039 1726867446.65857: variable 'ansible_search_path' from source: unknown 25039 1726867446.65885: calling self._execute() 25039 1726867446.65945: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867446.65949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867446.65956: variable 'omit' from source: magic vars 25039 1726867446.66246: variable 'ansible_distribution_major_version' from source: facts 25039 1726867446.66256: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867446.66263: variable 'omit' from source: magic vars 25039 1726867446.66291: variable 'omit' from source: magic vars 25039 1726867446.66322: variable 'omit' from source: magic vars 25039 1726867446.66352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867446.66378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867446.66394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867446.66411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867446.66423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867446.66446: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867446.66449: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867446.66451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 
1726867446.66523: Set connection var ansible_shell_executable to /bin/sh 25039 1726867446.66527: Set connection var ansible_timeout to 10 25039 1726867446.66535: Set connection var ansible_shell_type to sh 25039 1726867446.66537: Set connection var ansible_connection to ssh 25039 1726867446.66543: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867446.66548: Set connection var ansible_pipelining to False 25039 1726867446.66565: variable 'ansible_shell_executable' from source: unknown 25039 1726867446.66567: variable 'ansible_connection' from source: unknown 25039 1726867446.66570: variable 'ansible_module_compression' from source: unknown 25039 1726867446.66572: variable 'ansible_shell_type' from source: unknown 25039 1726867446.66575: variable 'ansible_shell_executable' from source: unknown 25039 1726867446.66579: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867446.66583: variable 'ansible_pipelining' from source: unknown 25039 1726867446.66585: variable 'ansible_timeout' from source: unknown 25039 1726867446.66589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867446.66686: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867446.66708: variable 'omit' from source: magic vars 25039 1726867446.66711: starting attempt loop 25039 1726867446.66723: running the handler 25039 1726867446.66738: _low_level_execute_command(): starting 25039 1726867446.66752: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867446.67635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867446.67638: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 25039 1726867446.67641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867446.67693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867446.67939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867446.68115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867446.68151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867446.70225: stdout chunk (state=3): >>>/root <<< 25039 1726867446.70463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867446.70467: stdout chunk (state=3): >>><<< 25039 1726867446.70469: stderr chunk (state=3): >>><<< 25039 1726867446.70474: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867446.70479: _low_level_execute_command(): starting 25039 1726867446.70482: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874 `" && echo ansible-tmp-1726867446.7044332-25270-92968087150874="` echo /root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874 `" ) && sleep 0' 25039 1726867446.71684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867446.71694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867446.71772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867446.73641: stdout chunk (state=3): >>>ansible-tmp-1726867446.7044332-25270-92968087150874=/root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874 <<< 25039 1726867446.73844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867446.73847: stdout chunk (state=3): >>><<< 25039 1726867446.73853: stderr chunk (state=3): >>><<< 25039 1726867446.73870: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867446.7044332-25270-92968087150874=/root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867446.73903: variable 'ansible_module_compression' from source: unknown 25039 1726867446.73955: ANSIBALLZ: Using generic lock for ansible.legacy.command 25039 1726867446.73960: ANSIBALLZ: Acquiring lock 25039 1726867446.73963: ANSIBALLZ: Lock acquired: 140682442827552 25039 1726867446.73965: ANSIBALLZ: Creating module 25039 1726867446.89872: ANSIBALLZ: Writing module into payload 25039 1726867446.89969: ANSIBALLZ: Writing module 25039 1726867446.89992: ANSIBALLZ: Renaming module 25039 1726867446.90084: ANSIBALLZ: Done creating module 25039 1726867446.90087: variable 'ansible_facts' from source: unknown 25039 1726867446.90095: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874/AnsiballZ_command.py 25039 1726867446.90329: Sending initial data 25039 1726867446.90333: Sent initial data (155 bytes) 25039 1726867446.90943: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867446.90957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867446.91042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867446.92688: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867446.92730: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867446.92813: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpy25ophpe /root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874/AnsiballZ_command.py <<< 25039 1726867446.92817: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874/AnsiballZ_command.py" <<< 25039 1726867446.92870: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpy25ophpe" to remote "/root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874/AnsiballZ_command.py" <<< 25039 1726867446.94230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867446.94233: stdout chunk (state=3): >>><<< 25039 1726867446.94235: stderr chunk (state=3): >>><<< 25039 1726867446.94237: done transferring module to remote 25039 1726867446.94239: _low_level_execute_command(): starting 25039 1726867446.94241: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874/ /root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874/AnsiballZ_command.py && sleep 0' 25039 1726867446.95216: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867446.95483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867446.95490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867446.95559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867446.97322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867446.97361: stderr chunk (state=3): >>><<< 25039 1726867446.97373: stdout chunk (state=3): >>><<< 25039 1726867446.97395: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867446.97402: _low_level_execute_command(): starting 25039 1726867446.97406: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874/AnsiballZ_command.py && sleep 0' 25039 1726867446.97981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867446.97991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867446.98001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867446.98014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867446.98034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867446.98081: stderr chunk (state=3): >>>debug2: match not found <<< 25039 1726867446.98085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867446.98087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25039 1726867446.98090: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 25039 1726867446.98092: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25039 1726867446.98094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867446.98096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867446.98098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867446.98109: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867446.98112: stderr chunk (state=3): >>>debug2: match found <<< 25039 1726867446.98139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867446.98247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867446.98252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867446.98293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867447.13620: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:24:07.130886", "end": "2024-09-20 17:24:07.134029", "delta": "0:00:00.003143", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867447.15120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867447.15147: stderr chunk (state=3): >>><<< 25039 1726867447.15150: stdout chunk (state=3): >>><<< 25039 1726867447.15166: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:24:07.130886", "end": "2024-09-20 17:24:07.134029", "delta": "0:00:00.003143", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867447.15196: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867447.15203: _low_level_execute_command(): starting 25039 1726867447.15208: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867446.7044332-25270-92968087150874/ > /dev/null 2>&1 && sleep 0' 25039 1726867447.15789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867447.15846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867447.15889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867447.17682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867447.17706: stderr chunk (state=3): >>><<< 25039 1726867447.17709: stdout chunk (state=3): >>><<< 25039 1726867447.17727: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867447.17734: handler run complete 25039 1726867447.17750: Evaluated conditional (False): False 25039 1726867447.17759: attempt loop complete, returning 
result 25039 1726867447.17761: _execute() done 25039 1726867447.17764: dumping result to json 25039 1726867447.17768: done dumping result, returning 25039 1726867447.17775: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0affcac9-a3a5-3ddc-7272-00000000013a] 25039 1726867447.17781: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000013a 25039 1726867447.17874: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000013a 25039 1726867447.17879: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003143", "end": "2024-09-20 17:24:07.134029", "rc": 0, "start": "2024-09-20 17:24:07.130886" } STDOUT: bonding_masters eth0 lo 25039 1726867447.17967: no more pending results, returning what we have 25039 1726867447.17970: results queue empty 25039 1726867447.17971: checking for any_errors_fatal 25039 1726867447.17973: done checking for any_errors_fatal 25039 1726867447.17973: checking for max_fail_percentage 25039 1726867447.17975: done checking for max_fail_percentage 25039 1726867447.17975: checking to see if all hosts have failed and the running result is not ok 25039 1726867447.17976: done checking to see if all hosts have failed 25039 1726867447.17979: getting the remaining hosts for this loop 25039 1726867447.17980: done getting the remaining hosts for this loop 25039 1726867447.17983: getting the next task for host managed_node1 25039 1726867447.17991: done getting next task for host managed_node1 25039 1726867447.17993: ^ task is: TASK: Set current_interfaces 25039 1726867447.17997: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867447.18000: getting variables 25039 1726867447.18002: in VariableManager get_vars() 25039 1726867447.18039: Calling all_inventory to load vars for managed_node1 25039 1726867447.18041: Calling groups_inventory to load vars for managed_node1 25039 1726867447.18044: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.18055: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.18058: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.18060: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.18227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.18367: done with get_vars() 25039 1726867447.18388: done getting variables 25039 1726867447.18452: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:24:07 -0400 (0:00:00.530) 0:00:04.710 ****** 25039 1726867447.18486: entering _queue_task() for managed_node1/set_fact 
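For orientation, the module invocation dumped in the trace above (`module_args` with `chdir: /sys/class/net` and `_raw_params: ls -1`) corresponds to a task roughly like the sketch below. The exact YAML of `get_current_interfaces.yml` is not shown in this trace, so the `register` name is inferred from the later `variable '_current_interfaces'` lines and is an assumption:

```yaml
# Hedged reconstruction from the traced module_args; register name assumed.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net   # each entry here is a network interface (plus bonding_masters)
  register: _current_interfaces
```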
25039 1726867447.18754: worker is 1 (out of 1 available) 25039 1726867447.18766: exiting _queue_task() for managed_node1/set_fact 25039 1726867447.18780: done queuing things up, now waiting for results queue to drain 25039 1726867447.18782: waiting for pending results... 25039 1726867447.19197: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 25039 1726867447.19203: in run() - task 0affcac9-a3a5-3ddc-7272-00000000013b 25039 1726867447.19207: variable 'ansible_search_path' from source: unknown 25039 1726867447.19210: variable 'ansible_search_path' from source: unknown 25039 1726867447.19213: calling self._execute() 25039 1726867447.19268: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.19274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.19285: variable 'omit' from source: magic vars 25039 1726867447.19664: variable 'ansible_distribution_major_version' from source: facts 25039 1726867447.19673: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867447.19682: variable 'omit' from source: magic vars 25039 1726867447.19727: variable 'omit' from source: magic vars 25039 1726867447.19828: variable '_current_interfaces' from source: set_fact 25039 1726867447.19872: variable 'omit' from source: magic vars 25039 1726867447.19904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867447.19949: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867447.19960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867447.19976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867447.19988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 25039 1726867447.20013: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867447.20017: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.20019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.20090: Set connection var ansible_shell_executable to /bin/sh 25039 1726867447.20094: Set connection var ansible_timeout to 10 25039 1726867447.20099: Set connection var ansible_shell_type to sh 25039 1726867447.20102: Set connection var ansible_connection to ssh 25039 1726867447.20108: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867447.20115: Set connection var ansible_pipelining to False 25039 1726867447.20133: variable 'ansible_shell_executable' from source: unknown 25039 1726867447.20137: variable 'ansible_connection' from source: unknown 25039 1726867447.20139: variable 'ansible_module_compression' from source: unknown 25039 1726867447.20141: variable 'ansible_shell_type' from source: unknown 25039 1726867447.20143: variable 'ansible_shell_executable' from source: unknown 25039 1726867447.20145: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.20149: variable 'ansible_pipelining' from source: unknown 25039 1726867447.20152: variable 'ansible_timeout' from source: unknown 25039 1726867447.20156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.20255: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867447.20262: variable 'omit' from source: magic vars 25039 1726867447.20269: starting attempt loop 25039 1726867447.20272: running the handler 25039 
1726867447.20282: handler run complete 25039 1726867447.20290: attempt loop complete, returning result 25039 1726867447.20293: _execute() done 25039 1726867447.20295: dumping result to json 25039 1726867447.20298: done dumping result, returning 25039 1726867447.20306: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0affcac9-a3a5-3ddc-7272-00000000013b] 25039 1726867447.20314: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000013b 25039 1726867447.20388: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000013b 25039 1726867447.20391: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 25039 1726867447.20469: no more pending results, returning what we have 25039 1726867447.20472: results queue empty 25039 1726867447.20473: checking for any_errors_fatal 25039 1726867447.20480: done checking for any_errors_fatal 25039 1726867447.20481: checking for max_fail_percentage 25039 1726867447.20482: done checking for max_fail_percentage 25039 1726867447.20483: checking to see if all hosts have failed and the running result is not ok 25039 1726867447.20484: done checking to see if all hosts have failed 25039 1726867447.20484: getting the remaining hosts for this loop 25039 1726867447.20485: done getting the remaining hosts for this loop 25039 1726867447.20488: getting the next task for host managed_node1 25039 1726867447.20494: done getting next task for host managed_node1 25039 1726867447.20496: ^ task is: TASK: Show current_interfaces 25039 1726867447.20499: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
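The `set_fact` result above (`current_interfaces: ["bonding_masters", "eth0", "lo"]`) is consistent with a task along these lines; the use of `stdout_lines` is an assumption inferred from the registered command output, not confirmed by the trace:

```yaml
# Hedged sketch: derives the fact from the earlier command's output.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"  # assumed source expression
```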
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867447.20503: getting variables 25039 1726867447.20505: in VariableManager get_vars() 25039 1726867447.20534: Calling all_inventory to load vars for managed_node1 25039 1726867447.20537: Calling groups_inventory to load vars for managed_node1 25039 1726867447.20539: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.20547: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.20550: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.20552: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.20654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.20768: done with get_vars() 25039 1726867447.20775: done getting variables 25039 1726867447.20839: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:24:07 -0400 (0:00:00.023) 0:00:04.734 ****** 25039 1726867447.20858: entering _queue_task() for managed_node1/debug 25039 1726867447.20859: Creating lock for debug 25039 1726867447.21043: worker is 1 (out of 1 available) 25039 1726867447.21056: exiting _queue_task() for managed_node1/debug 25039 1726867447.21067: done queuing things up, now waiting for results queue to drain 25039 1726867447.21069: waiting 
for pending results... 25039 1726867447.21204: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 25039 1726867447.21255: in run() - task 0affcac9-a3a5-3ddc-7272-00000000012c 25039 1726867447.21265: variable 'ansible_search_path' from source: unknown 25039 1726867447.21268: variable 'ansible_search_path' from source: unknown 25039 1726867447.21296: calling self._execute() 25039 1726867447.21408: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.21416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.21425: variable 'omit' from source: magic vars 25039 1726867447.21982: variable 'ansible_distribution_major_version' from source: facts 25039 1726867447.21985: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867447.21987: variable 'omit' from source: magic vars 25039 1726867447.21989: variable 'omit' from source: magic vars 25039 1726867447.21991: variable 'current_interfaces' from source: set_fact 25039 1726867447.21993: variable 'omit' from source: magic vars 25039 1726867447.21996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867447.22030: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867447.22052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867447.22074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867447.22102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867447.22141: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867447.22149: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.22156: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.22247: Set connection var ansible_shell_executable to /bin/sh 25039 1726867447.22264: Set connection var ansible_timeout to 10 25039 1726867447.22270: Set connection var ansible_shell_type to sh 25039 1726867447.22272: Set connection var ansible_connection to ssh 25039 1726867447.22284: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867447.22293: Set connection var ansible_pipelining to False 25039 1726867447.22322: variable 'ansible_shell_executable' from source: unknown 25039 1726867447.22326: variable 'ansible_connection' from source: unknown 25039 1726867447.22329: variable 'ansible_module_compression' from source: unknown 25039 1726867447.22331: variable 'ansible_shell_type' from source: unknown 25039 1726867447.22333: variable 'ansible_shell_executable' from source: unknown 25039 1726867447.22335: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.22337: variable 'ansible_pipelining' from source: unknown 25039 1726867447.22343: variable 'ansible_timeout' from source: unknown 25039 1726867447.22346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.22684: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867447.22688: variable 'omit' from source: magic vars 25039 1726867447.22690: starting attempt loop 25039 1726867447.22693: running the handler 25039 1726867447.22695: handler run complete 25039 1726867447.22697: attempt loop complete, returning result 25039 1726867447.22700: _execute() done 25039 1726867447.22702: dumping result to json 25039 1726867447.22703: done dumping result, returning 25039 
1726867447.22706: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0affcac9-a3a5-3ddc-7272-00000000012c] 25039 1726867447.22710: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000012c 25039 1726867447.22768: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000012c 25039 1726867447.22772: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 25039 1726867447.22859: no more pending results, returning what we have 25039 1726867447.22862: results queue empty 25039 1726867447.22863: checking for any_errors_fatal 25039 1726867447.22867: done checking for any_errors_fatal 25039 1726867447.22867: checking for max_fail_percentage 25039 1726867447.22868: done checking for max_fail_percentage 25039 1726867447.22869: checking to see if all hosts have failed and the running result is not ok 25039 1726867447.22870: done checking to see if all hosts have failed 25039 1726867447.22871: getting the remaining hosts for this loop 25039 1726867447.22872: done getting the remaining hosts for this loop 25039 1726867447.22875: getting the next task for host managed_node1 25039 1726867447.22889: done getting next task for host managed_node1 25039 1726867447.22892: ^ task is: TASK: Include the task 'manage_test_interface.yml' 25039 1726867447.22894: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
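The debug output above (`MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']`) matches a task of roughly this shape, per the task path `show_interfaces.yml:5` in the trace; the exact `msg` template is an assumption:

```yaml
# Hedged sketch of the traced debug task.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```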
False 25039 1726867447.22897: getting variables 25039 1726867447.22899: in VariableManager get_vars() 25039 1726867447.22924: Calling all_inventory to load vars for managed_node1 25039 1726867447.22926: Calling groups_inventory to load vars for managed_node1 25039 1726867447.22929: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.22938: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.22941: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.22943: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.23103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.23301: done with get_vars() 25039 1726867447.23310: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:11 Friday 20 September 2024 17:24:07 -0400 (0:00:00.025) 0:00:04.759 ****** 25039 1726867447.23419: entering _queue_task() for managed_node1/include_tasks 25039 1726867447.23597: worker is 1 (out of 1 available) 25039 1726867447.23609: exiting _queue_task() for managed_node1/include_tasks 25039 1726867447.23620: done queuing things up, now waiting for results queue to drain 25039 1726867447.23622: waiting for pending results... 
25039 1726867447.23767: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 25039 1726867447.23816: in run() - task 0affcac9-a3a5-3ddc-7272-00000000000c 25039 1726867447.23826: variable 'ansible_search_path' from source: unknown 25039 1726867447.23855: calling self._execute() 25039 1726867447.23914: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.23918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.23924: variable 'omit' from source: magic vars 25039 1726867447.24158: variable 'ansible_distribution_major_version' from source: facts 25039 1726867447.24169: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867447.24172: _execute() done 25039 1726867447.24175: dumping result to json 25039 1726867447.24188: done dumping result, returning 25039 1726867447.24191: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [0affcac9-a3a5-3ddc-7272-00000000000c] 25039 1726867447.24193: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000000c 25039 1726867447.24268: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000000c 25039 1726867447.24270: WORKER PROCESS EXITING 25039 1726867447.24315: no more pending results, returning what we have 25039 1726867447.24319: in VariableManager get_vars() 25039 1726867447.24353: Calling all_inventory to load vars for managed_node1 25039 1726867447.24355: Calling groups_inventory to load vars for managed_node1 25039 1726867447.24357: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.24365: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.24368: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.24370: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.24471: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.24603: done with get_vars() 25039 1726867447.24610: variable 'ansible_search_path' from source: unknown 25039 1726867447.24618: we have included files to process 25039 1726867447.24619: generating all_blocks data 25039 1726867447.24620: done generating all_blocks data 25039 1726867447.24623: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25039 1726867447.24624: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25039 1726867447.24625: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25039 1726867447.24934: in VariableManager get_vars() 25039 1726867447.24948: done with get_vars() 25039 1726867447.25089: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 25039 1726867447.25558: done processing included file 25039 1726867447.25560: iterating over new_blocks loaded from include file 25039 1726867447.25562: in VariableManager get_vars() 25039 1726867447.25581: done with get_vars() 25039 1726867447.25583: filtering new block on tags 25039 1726867447.25615: done filtering new block on tags 25039 1726867447.25618: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 25039 1726867447.25622: extending task lists for all hosts with included blocks 25039 1726867447.25769: done extending task lists 25039 1726867447.25770: done processing included files 25039 1726867447.25771: results queue empty 25039 1726867447.25772: checking for any_errors_fatal 25039 1726867447.25775: done checking for 
any_errors_fatal 25039 1726867447.25775: checking for max_fail_percentage 25039 1726867447.25778: done checking for max_fail_percentage 25039 1726867447.25779: checking to see if all hosts have failed and the running result is not ok 25039 1726867447.25780: done checking to see if all hosts have failed 25039 1726867447.25780: getting the remaining hosts for this loop 25039 1726867447.25782: done getting the remaining hosts for this loop 25039 1726867447.25784: getting the next task for host managed_node1 25039 1726867447.25787: done getting next task for host managed_node1 25039 1726867447.25789: ^ task is: TASK: Ensure state in ["present", "absent"] 25039 1726867447.25791: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867447.25793: getting variables 25039 1726867447.25794: in VariableManager get_vars() 25039 1726867447.25806: Calling all_inventory to load vars for managed_node1 25039 1726867447.25810: Calling groups_inventory to load vars for managed_node1 25039 1726867447.25812: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.25817: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.25819: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.25821: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.25954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.26141: done with get_vars() 25039 1726867447.26150: done getting variables 25039 1726867447.26210: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 17:24:07 -0400 (0:00:00.028) 0:00:04.787 ****** 25039 1726867447.26235: entering _queue_task() for managed_node1/fail 25039 1726867447.26236: Creating lock for fail 25039 1726867447.26438: worker is 1 (out of 1 available) 25039 1726867447.26448: exiting _queue_task() for managed_node1/fail 25039 1726867447.26459: done queuing things up, now waiting for results queue to drain 25039 1726867447.26461: waiting for pending results... 
25039 1726867447.26600: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 25039 1726867447.26652: in run() - task 0affcac9-a3a5-3ddc-7272-000000000156 25039 1726867447.26663: variable 'ansible_search_path' from source: unknown 25039 1726867447.26666: variable 'ansible_search_path' from source: unknown 25039 1726867447.26699: calling self._execute() 25039 1726867447.26758: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.26761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.26770: variable 'omit' from source: magic vars 25039 1726867447.27058: variable 'ansible_distribution_major_version' from source: facts 25039 1726867447.27067: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867447.27159: variable 'state' from source: include params 25039 1726867447.27164: Evaluated conditional (state not in ["present", "absent"]): False 25039 1726867447.27166: when evaluation is False, skipping this task 25039 1726867447.27169: _execute() done 25039 1726867447.27171: dumping result to json 25039 1726867447.27174: done dumping result, returning 25039 1726867447.27183: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [0affcac9-a3a5-3ddc-7272-000000000156] 25039 1726867447.27187: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000156 25039 1726867447.27264: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000156 25039 1726867447.27267: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 25039 1726867447.27315: no more pending results, returning what we have 25039 1726867447.27318: results queue empty 25039 1726867447.27319: checking for any_errors_fatal 25039 1726867447.27320: done checking for any_errors_fatal 25039 1726867447.27321: 
checking for max_fail_percentage 25039 1726867447.27322: done checking for max_fail_percentage 25039 1726867447.27323: checking to see if all hosts have failed and the running result is not ok 25039 1726867447.27324: done checking to see if all hosts have failed 25039 1726867447.27325: getting the remaining hosts for this loop 25039 1726867447.27327: done getting the remaining hosts for this loop 25039 1726867447.27329: getting the next task for host managed_node1 25039 1726867447.27333: done getting next task for host managed_node1 25039 1726867447.27335: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 25039 1726867447.27338: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867447.27340: getting variables 25039 1726867447.27341: in VariableManager get_vars() 25039 1726867447.27370: Calling all_inventory to load vars for managed_node1 25039 1726867447.27372: Calling groups_inventory to load vars for managed_node1 25039 1726867447.27374: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.27390: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.27393: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.27395: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.27517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.27630: done with get_vars() 25039 1726867447.27636: done getting variables 25039 1726867447.27671: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 17:24:07 -0400 (0:00:00.014) 0:00:04.802 ****** 25039 1726867447.27690: entering _queue_task() for managed_node1/fail 25039 1726867447.27847: worker is 1 (out of 1 available) 25039 1726867447.27858: exiting _queue_task() for managed_node1/fail 25039 1726867447.27868: done queuing things up, now waiting for results queue to drain 25039 1726867447.27869: waiting for pending results... 
25039 1726867447.28000: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 25039 1726867447.28060: in run() - task 0affcac9-a3a5-3ddc-7272-000000000157 25039 1726867447.28070: variable 'ansible_search_path' from source: unknown 25039 1726867447.28074: variable 'ansible_search_path' from source: unknown 25039 1726867447.28099: calling self._execute() 25039 1726867447.28154: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.28158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.28166: variable 'omit' from source: magic vars 25039 1726867447.28397: variable 'ansible_distribution_major_version' from source: facts 25039 1726867447.28406: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867447.28498: variable 'type' from source: play vars 25039 1726867447.28504: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 25039 1726867447.28506: when evaluation is False, skipping this task 25039 1726867447.28512: _execute() done 25039 1726867447.28515: dumping result to json 25039 1726867447.28518: done dumping result, returning 25039 1726867447.28526: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcac9-a3a5-3ddc-7272-000000000157] 25039 1726867447.28529: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000157 25039 1726867447.28605: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000157 25039 1726867447.28608: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 25039 1726867447.28665: no more pending results, returning what we have 25039 1726867447.28668: results queue empty 25039 1726867447.28669: checking for any_errors_fatal 25039 1726867447.28673: done checking for any_errors_fatal 25039 1726867447.28673: 
checking for max_fail_percentage 25039 1726867447.28675: done checking for max_fail_percentage 25039 1726867447.28675: checking to see if all hosts have failed and the running result is not ok 25039 1726867447.28676: done checking to see if all hosts have failed 25039 1726867447.28679: getting the remaining hosts for this loop 25039 1726867447.28680: done getting the remaining hosts for this loop 25039 1726867447.28683: getting the next task for host managed_node1 25039 1726867447.28687: done getting next task for host managed_node1 25039 1726867447.28689: ^ task is: TASK: Include the task 'show_interfaces.yml' 25039 1726867447.28692: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867447.28695: getting variables 25039 1726867447.28696: in VariableManager get_vars() 25039 1726867447.28724: Calling all_inventory to load vars for managed_node1 25039 1726867447.28726: Calling groups_inventory to load vars for managed_node1 25039 1726867447.28728: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.28734: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.28735: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.28737: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.28836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.29098: done with get_vars() 25039 1726867447.29104: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 17:24:07 -0400 (0:00:00.014) 0:00:04.817 ****** 25039 1726867447.29158: entering _queue_task() for managed_node1/include_tasks 25039 1726867447.29314: worker is 1 (out of 1 available) 25039 1726867447.29325: exiting _queue_task() for managed_node1/include_tasks 25039 1726867447.29336: done queuing things up, now waiting for results queue to drain 25039 1726867447.29338: waiting for pending results... 
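The two skips above ("Ensure state ..." and "Ensure type ...") each report a `false_condition` string, which is the task's `when` expression evaluated against the include params and play vars. A hypothetical sketch of the two guard tasks at `manage_test_interface.yml:3` and `:8`, with the `when` expressions taken verbatim from the logged `false_condition` values (the `fail` messages, if any, are not visible in this excerpt):

```yaml
# Hypothetical reconstruction; only the names and conditions appear in the trace.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be present or absent"  # placeholder; real message not shown
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be dummy, tap or veth"  # placeholder; real message not shown
  when: type not in ["dummy", "tap", "veth"]
```

When the guard condition is false, the action never runs, which is why the trace records "when evaluation is False, skipping this task" and a `skipping:` result with `"changed": false` instead of a module invocation.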
25039 1726867447.29468: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 25039 1726867447.29525: in run() - task 0affcac9-a3a5-3ddc-7272-000000000158 25039 1726867447.29534: variable 'ansible_search_path' from source: unknown 25039 1726867447.29537: variable 'ansible_search_path' from source: unknown 25039 1726867447.29563: calling self._execute() 25039 1726867447.29623: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.29626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.29634: variable 'omit' from source: magic vars 25039 1726867447.29876: variable 'ansible_distribution_major_version' from source: facts 25039 1726867447.29887: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867447.29893: _execute() done 25039 1726867447.29896: dumping result to json 25039 1726867447.29898: done dumping result, returning 25039 1726867447.29905: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-3ddc-7272-000000000158] 25039 1726867447.29909: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000158 25039 1726867447.29990: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000158 25039 1726867447.29993: WORKER PROCESS EXITING 25039 1726867447.30018: no more pending results, returning what we have 25039 1726867447.30026: in VariableManager get_vars() 25039 1726867447.30062: Calling all_inventory to load vars for managed_node1 25039 1726867447.30064: Calling groups_inventory to load vars for managed_node1 25039 1726867447.30066: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.30075: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.30079: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.30082: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.30195: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.30309: done with get_vars() 25039 1726867447.30314: variable 'ansible_search_path' from source: unknown 25039 1726867447.30315: variable 'ansible_search_path' from source: unknown 25039 1726867447.30338: we have included files to process 25039 1726867447.30339: generating all_blocks data 25039 1726867447.30340: done generating all_blocks data 25039 1726867447.30343: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25039 1726867447.30344: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25039 1726867447.30345: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25039 1726867447.30407: in VariableManager get_vars() 25039 1726867447.30421: done with get_vars() 25039 1726867447.30493: done processing included file 25039 1726867447.30494: iterating over new_blocks loaded from include file 25039 1726867447.30495: in VariableManager get_vars() 25039 1726867447.30507: done with get_vars() 25039 1726867447.30508: filtering new block on tags 25039 1726867447.30519: done filtering new block on tags 25039 1726867447.30520: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 25039 1726867447.30523: extending task lists for all hosts with included blocks 25039 1726867447.30752: done extending task lists 25039 1726867447.30754: done processing included files 25039 1726867447.30754: results queue empty 25039 1726867447.30755: checking for any_errors_fatal 25039 1726867447.30756: done checking for any_errors_fatal 25039 1726867447.30757: checking for 
max_fail_percentage 25039 1726867447.30757: done checking for max_fail_percentage 25039 1726867447.30758: checking to see if all hosts have failed and the running result is not ok 25039 1726867447.30758: done checking to see if all hosts have failed 25039 1726867447.30759: getting the remaining hosts for this loop 25039 1726867447.30760: done getting the remaining hosts for this loop 25039 1726867447.30761: getting the next task for host managed_node1 25039 1726867447.30763: done getting next task for host managed_node1 25039 1726867447.30765: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 25039 1726867447.30767: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867447.30769: getting variables 25039 1726867447.30769: in VariableManager get_vars() 25039 1726867447.30780: Calling all_inventory to load vars for managed_node1 25039 1726867447.30801: Calling groups_inventory to load vars for managed_node1 25039 1726867447.30802: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.30806: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.30807: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.30810: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.30888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.30995: done with get_vars() 25039 1726867447.31001: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:24:07 -0400 (0:00:00.018) 0:00:04.836 ****** 25039 1726867447.31045: entering _queue_task() for managed_node1/include_tasks 25039 1726867447.31211: worker is 1 (out of 1 available) 25039 1726867447.31222: exiting _queue_task() for managed_node1/include_tasks 25039 1726867447.31234: done queuing things up, now waiting for results queue to drain 25039 1726867447.31236: waiting for pending results... 
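The task path logged above (`show_interfaces.yml:3`) shows that the first task of the just-included file is itself another include, producing the nested HOST STATE child states seen in the trace. A hypothetical sketch of that chain, assuming only the names and paths in the log:

```yaml
# Hypothetical: first task of show_interfaces.yml, per the task path above.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```

Each nested dynamic include adds one level of "tasks child state?" to the HOST STATE dumps, which is why the state lines grow longer as the trace descends into included files.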
25039 1726867447.31369: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 25039 1726867447.31434: in run() - task 0affcac9-a3a5-3ddc-7272-00000000017f 25039 1726867447.31443: variable 'ansible_search_path' from source: unknown 25039 1726867447.31447: variable 'ansible_search_path' from source: unknown 25039 1726867447.31474: calling self._execute() 25039 1726867447.31533: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.31536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.31545: variable 'omit' from source: magic vars 25039 1726867447.31782: variable 'ansible_distribution_major_version' from source: facts 25039 1726867447.31795: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867447.31798: _execute() done 25039 1726867447.31801: dumping result to json 25039 1726867447.31803: done dumping result, returning 25039 1726867447.31812: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-3ddc-7272-00000000017f] 25039 1726867447.31817: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000017f 25039 1726867447.31894: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000017f 25039 1726867447.31897: WORKER PROCESS EXITING 25039 1726867447.31922: no more pending results, returning what we have 25039 1726867447.31929: in VariableManager get_vars() 25039 1726867447.31964: Calling all_inventory to load vars for managed_node1 25039 1726867447.31966: Calling groups_inventory to load vars for managed_node1 25039 1726867447.31968: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.31979: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.31981: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.31984: Calling groups_plugins_play to load vars for managed_node1 25039 
1726867447.32095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.32221: done with get_vars() 25039 1726867447.32226: variable 'ansible_search_path' from source: unknown 25039 1726867447.32227: variable 'ansible_search_path' from source: unknown 25039 1726867447.32261: we have included files to process 25039 1726867447.32262: generating all_blocks data 25039 1726867447.32263: done generating all_blocks data 25039 1726867447.32264: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25039 1726867447.32264: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25039 1726867447.32265: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25039 1726867447.32424: done processing included file 25039 1726867447.32425: iterating over new_blocks loaded from include file 25039 1726867447.32426: in VariableManager get_vars() 25039 1726867447.32437: done with get_vars() 25039 1726867447.32438: filtering new block on tags 25039 1726867447.32449: done filtering new block on tags 25039 1726867447.32452: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 25039 1726867447.32455: extending task lists for all hosts with included blocks 25039 1726867447.32538: done extending task lists 25039 1726867447.32539: done processing included files 25039 1726867447.32539: results queue empty 25039 1726867447.32540: checking for any_errors_fatal 25039 1726867447.32541: done checking for any_errors_fatal 25039 1726867447.32542: checking for max_fail_percentage 25039 1726867447.32542: done 
checking for max_fail_percentage 25039 1726867447.32543: checking to see if all hosts have failed and the running result is not ok 25039 1726867447.32544: done checking to see if all hosts have failed 25039 1726867447.32544: getting the remaining hosts for this loop 25039 1726867447.32545: done getting the remaining hosts for this loop 25039 1726867447.32546: getting the next task for host managed_node1 25039 1726867447.32549: done getting next task for host managed_node1 25039 1726867447.32550: ^ task is: TASK: Gather current interface info 25039 1726867447.32552: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867447.32553: getting variables 25039 1726867447.32554: in VariableManager get_vars() 25039 1726867447.32564: Calling all_inventory to load vars for managed_node1 25039 1726867447.32565: Calling groups_inventory to load vars for managed_node1 25039 1726867447.32566: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.32570: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.32571: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.32572: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.32650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.32759: done with get_vars() 25039 1726867447.32765: done getting variables 25039 1726867447.32794: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:24:07 -0400 (0:00:00.017) 0:00:04.853 ****** 25039 1726867447.32812: entering _queue_task() for managed_node1/command 25039 1726867447.32971: worker is 1 (out of 1 available) 25039 1726867447.32985: exiting _queue_task() for managed_node1/command 25039 1726867447.32996: done queuing things up, now waiting for results queue to drain 25039 1726867447.32998: waiting for pending results... 
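The next task runs through the `command` action plugin over the `ssh` connection (both loaded from cache above). The actual command string is not visible in this excerpt, so the sketch below shows only the task's shape; the command value and register name are placeholders:

```yaml
# Hypothetical shape only; the real command and register name are not in this excerpt.
- name: Gather current interface info
  command: "<command not shown in this excerpt>"
  register: _current_interfaces  # hypothetical variable name
```

Before the module runs, the trace below shows the connection's bootstrap sequence: `_low_level_execute_command()` first runs `echo ~` to discover the remote home directory, then creates a per-task temp directory under `~/.ansible/tmp` with `umask 77` so the copied module payload is only readable by the connecting user.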
25039 1726867447.33133: running TaskExecutor() for managed_node1/TASK: Gather current interface info 25039 1726867447.33191: in run() - task 0affcac9-a3a5-3ddc-7272-0000000001b6 25039 1726867447.33201: variable 'ansible_search_path' from source: unknown 25039 1726867447.33205: variable 'ansible_search_path' from source: unknown 25039 1726867447.33233: calling self._execute() 25039 1726867447.33287: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.33291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.33299: variable 'omit' from source: magic vars 25039 1726867447.33568: variable 'ansible_distribution_major_version' from source: facts 25039 1726867447.33578: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867447.33584: variable 'omit' from source: magic vars 25039 1726867447.33618: variable 'omit' from source: magic vars 25039 1726867447.33642: variable 'omit' from source: magic vars 25039 1726867447.33672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867447.33699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867447.33716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867447.33730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867447.33739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867447.33760: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867447.33765: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.33767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 
1726867447.33834: Set connection var ansible_shell_executable to /bin/sh 25039 1726867447.33837: Set connection var ansible_timeout to 10 25039 1726867447.33844: Set connection var ansible_shell_type to sh 25039 1726867447.33846: Set connection var ansible_connection to ssh 25039 1726867447.33852: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867447.33857: Set connection var ansible_pipelining to False 25039 1726867447.33874: variable 'ansible_shell_executable' from source: unknown 25039 1726867447.33880: variable 'ansible_connection' from source: unknown 25039 1726867447.33884: variable 'ansible_module_compression' from source: unknown 25039 1726867447.33886: variable 'ansible_shell_type' from source: unknown 25039 1726867447.33888: variable 'ansible_shell_executable' from source: unknown 25039 1726867447.33891: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.33893: variable 'ansible_pipelining' from source: unknown 25039 1726867447.33895: variable 'ansible_timeout' from source: unknown 25039 1726867447.33897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.33987: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867447.33994: variable 'omit' from source: magic vars 25039 1726867447.33999: starting attempt loop 25039 1726867447.34002: running the handler 25039 1726867447.34018: _low_level_execute_command(): starting 25039 1726867447.34025: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867447.34520: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 25039 1726867447.34523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.34526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867447.34528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.34582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867447.34585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867447.34641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867447.36298: stdout chunk (state=3): >>>/root <<< 25039 1726867447.36400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867447.36425: stderr chunk (state=3): >>><<< 25039 1726867447.36428: stdout chunk (state=3): >>><<< 25039 1726867447.36446: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867447.36457: _low_level_execute_command(): starting 25039 1726867447.36460: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682 `" && echo ansible-tmp-1726867447.36445-25306-266535095860682="` echo /root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682 `" ) && sleep 0' 25039 1726867447.36887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867447.36890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.36894: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration 
<<< 25039 1726867447.36902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867447.36905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.36950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867447.36957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867447.37002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867447.38869: stdout chunk (state=3): >>>ansible-tmp-1726867447.36445-25306-266535095860682=/root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682 <<< 25039 1726867447.38976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867447.39004: stderr chunk (state=3): >>><<< 25039 1726867447.39010: stdout chunk (state=3): >>><<< 25039 1726867447.39022: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867447.36445-25306-266535095860682=/root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867447.39046: variable 'ansible_module_compression' from source: unknown 25039 1726867447.39086: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867447.39119: variable 'ansible_facts' from source: unknown 25039 1726867447.39173: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682/AnsiballZ_command.py 25039 1726867447.39268: Sending initial data 25039 1726867447.39271: Sent initial data (154 bytes) 25039 1726867447.39706: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867447.39712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.39715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 1726867447.39717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867447.39719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.39768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867447.39773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867447.39816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867447.41337: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25039 1726867447.41343: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867447.41379: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867447.41423: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpur5x15ro /root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682/AnsiballZ_command.py <<< 25039 1726867447.41427: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682/AnsiballZ_command.py" <<< 25039 1726867447.41474: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 25039 1726867447.41478: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpur5x15ro" to remote "/root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682/AnsiballZ_command.py" <<< 25039 1726867447.42006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867447.42047: stderr chunk (state=3): >>><<< 25039 1726867447.42050: stdout chunk (state=3): >>><<< 25039 1726867447.42089: done transferring module to remote 25039 1726867447.42098: _low_level_execute_command(): starting 25039 1726867447.42102: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682/ /root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682/AnsiballZ_command.py && sleep 0' 25039 1726867447.42525: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867447.42528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867447.42531: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.42533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867447.42539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.42582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867447.42588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867447.42635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867447.44376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867447.44401: stderr chunk (state=3): >>><<< 25039 1726867447.44404: stdout chunk (state=3): >>><<< 25039 1726867447.44417: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867447.44420: _low_level_execute_command(): starting 25039 1726867447.44423: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682/AnsiballZ_command.py && sleep 0' 25039 1726867447.44820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867447.44823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867447.44825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 1726867447.44828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867447.44829: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.44871: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867447.44889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867447.44937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867447.60299: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:24:07.598020", "end": "2024-09-20 17:24:07.601171", "delta": "0:00:00.003151", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867447.61748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867447.61771: stderr chunk (state=3): >>><<< 25039 1726867447.61775: stdout chunk (state=3): >>><<< 25039 1726867447.61794: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:24:07.598020", "end": "2024-09-20 17:24:07.601171", "delta": "0:00:00.003151", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867447.61827: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867447.61834: _low_level_execute_command(): starting 25039 1726867447.61839: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867447.36445-25306-266535095860682/ > /dev/null 2>&1 && sleep 0' 25039 1726867447.62269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867447.62273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.62275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867447.62283: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.62330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867447.62334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867447.62382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867447.64165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867447.64187: stderr chunk (state=3): >>><<< 25039 1726867447.64190: stdout chunk (state=3): >>><<< 25039 1726867447.64202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 25039 1726867447.64210: handler run complete 25039 1726867447.64231: Evaluated conditional (False): False 25039 1726867447.64236: attempt loop complete, returning result 25039 1726867447.64239: _execute() done 25039 1726867447.64241: dumping result to json 25039 1726867447.64247: done dumping result, returning 25039 1726867447.64254: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0affcac9-a3a5-3ddc-7272-0000000001b6] 25039 1726867447.64258: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000001b6 25039 1726867447.64358: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000001b6 25039 1726867447.64361: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003151", "end": "2024-09-20 17:24:07.601171", "rc": 0, "start": "2024-09-20 17:24:07.598020" } STDOUT: bonding_masters eth0 lo 25039 1726867447.64435: no more pending results, returning what we have 25039 1726867447.64438: results queue empty 25039 1726867447.64439: checking for any_errors_fatal 25039 1726867447.64441: done checking for any_errors_fatal 25039 1726867447.64441: checking for max_fail_percentage 25039 1726867447.64443: done checking for max_fail_percentage 25039 1726867447.64443: checking to see if all hosts have failed and the running result is not ok 25039 1726867447.64444: done checking to see if all hosts have failed 25039 1726867447.64445: getting the remaining hosts for this loop 25039 1726867447.64446: done getting the remaining hosts for this loop 25039 1726867447.64449: getting the next task for host managed_node1 25039 1726867447.64457: done getting next task for host managed_node1 25039 1726867447.64459: ^ task is: TASK: Set current_interfaces 25039 1726867447.64464: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867447.64468: getting variables 25039 1726867447.64471: in VariableManager get_vars() 25039 1726867447.64562: Calling all_inventory to load vars for managed_node1 25039 1726867447.64564: Calling groups_inventory to load vars for managed_node1 25039 1726867447.64566: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.64576: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.64586: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.64589: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.64695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.64818: done with get_vars() 25039 1726867447.64825: done getting variables 25039 1726867447.64867: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:24:07 -0400 (0:00:00.320) 0:00:05.174 ****** 25039 1726867447.64890: entering _queue_task() for managed_node1/set_fact 25039 1726867447.65079: worker is 1 (out of 1 available) 25039 1726867447.65090: exiting _queue_task() for managed_node1/set_fact 25039 1726867447.65101: done queuing things up, now waiting for results queue to drain 25039 1726867447.65103: waiting for pending results... 25039 1726867447.65257: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 25039 1726867447.65333: in run() - task 0affcac9-a3a5-3ddc-7272-0000000001b7 25039 1726867447.65341: variable 'ansible_search_path' from source: unknown 25039 1726867447.65344: variable 'ansible_search_path' from source: unknown 25039 1726867447.65373: calling self._execute() 25039 1726867447.65437: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.65441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.65451: variable 'omit' from source: magic vars 25039 1726867447.65709: variable 'ansible_distribution_major_version' from source: facts 25039 1726867447.65718: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867447.65723: variable 'omit' from source: magic vars 25039 1726867447.65755: variable 'omit' from source: magic vars 25039 1726867447.65832: variable '_current_interfaces' from source: set_fact 25039 1726867447.65874: variable 'omit' from source: magic vars 25039 1726867447.65910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 
1726867447.65935: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867447.65950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867447.65963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867447.65973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867447.66003: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867447.66006: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.66011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.66071: Set connection var ansible_shell_executable to /bin/sh 25039 1726867447.66079: Set connection var ansible_timeout to 10 25039 1726867447.66084: Set connection var ansible_shell_type to sh 25039 1726867447.66087: Set connection var ansible_connection to ssh 25039 1726867447.66095: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867447.66098: Set connection var ansible_pipelining to False 25039 1726867447.66121: variable 'ansible_shell_executable' from source: unknown 25039 1726867447.66124: variable 'ansible_connection' from source: unknown 25039 1726867447.66127: variable 'ansible_module_compression' from source: unknown 25039 1726867447.66129: variable 'ansible_shell_type' from source: unknown 25039 1726867447.66131: variable 'ansible_shell_executable' from source: unknown 25039 1726867447.66133: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.66136: variable 'ansible_pipelining' from source: unknown 25039 1726867447.66138: variable 'ansible_timeout' from source: unknown 25039 1726867447.66142: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node1' 25039 1726867447.66239: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867447.66246: variable 'omit' from source: magic vars 25039 1726867447.66251: starting attempt loop 25039 1726867447.66254: running the handler 25039 1726867447.66262: handler run complete 25039 1726867447.66271: attempt loop complete, returning result 25039 1726867447.66273: _execute() done 25039 1726867447.66276: dumping result to json 25039 1726867447.66279: done dumping result, returning 25039 1726867447.66287: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0affcac9-a3a5-3ddc-7272-0000000001b7] 25039 1726867447.66289: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000001b7 25039 1726867447.66366: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000001b7 25039 1726867447.66369: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 25039 1726867447.66424: no more pending results, returning what we have 25039 1726867447.66427: results queue empty 25039 1726867447.66428: checking for any_errors_fatal 25039 1726867447.66434: done checking for any_errors_fatal 25039 1726867447.66434: checking for max_fail_percentage 25039 1726867447.66436: done checking for max_fail_percentage 25039 1726867447.66436: checking to see if all hosts have failed and the running result is not ok 25039 1726867447.66437: done checking to see if all hosts have failed 25039 1726867447.66438: getting the remaining hosts for this loop 25039 1726867447.66439: done getting the remaining hosts for this loop 25039 1726867447.66442: getting the next task for host managed_node1 
25039 1726867447.66448: done getting next task for host managed_node1 25039 1726867447.66450: ^ task is: TASK: Show current_interfaces 25039 1726867447.66454: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867447.66457: getting variables 25039 1726867447.66458: in VariableManager get_vars() 25039 1726867447.66489: Calling all_inventory to load vars for managed_node1 25039 1726867447.66491: Calling groups_inventory to load vars for managed_node1 25039 1726867447.66493: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.66501: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.66503: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.66506: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.66616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.66752: done with get_vars() 25039 1726867447.66759: done getting variables 25039 1726867447.66799: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:24:07 -0400 (0:00:00.019) 0:00:05.193 ****** 25039 1726867447.66821: entering _queue_task() for managed_node1/debug 25039 1726867447.66998: worker is 1 (out of 1 available) 25039 1726867447.67013: exiting _queue_task() for managed_node1/debug 25039 1726867447.67026: done queuing things up, now waiting for results queue to drain 25039 1726867447.67027: waiting for pending results... 
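The task queued here (task path `.../tasks/show_interfaces.yml:5`) is a `debug` action that prints the fact set by the previous task. Based on the `MSG: current_interfaces: [...]` output it produces, a minimal equivalent would be:

```yaml
# Sketch inferred from the task name and its MSG output in this log;
# the actual file may differ in formatting.
- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```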
25039 1726867447.67155: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 25039 1726867447.67218: in run() - task 0affcac9-a3a5-3ddc-7272-000000000180 25039 1726867447.67228: variable 'ansible_search_path' from source: unknown 25039 1726867447.67232: variable 'ansible_search_path' from source: unknown 25039 1726867447.67259: calling self._execute() 25039 1726867447.67318: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.67322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.67330: variable 'omit' from source: magic vars 25039 1726867447.67568: variable 'ansible_distribution_major_version' from source: facts 25039 1726867447.67581: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867447.67587: variable 'omit' from source: magic vars 25039 1726867447.67620: variable 'omit' from source: magic vars 25039 1726867447.67685: variable 'current_interfaces' from source: set_fact 25039 1726867447.67706: variable 'omit' from source: magic vars 25039 1726867447.67735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867447.67759: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867447.67775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867447.67792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867447.67799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867447.67826: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867447.67830: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.67832: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.67892: Set connection var ansible_shell_executable to /bin/sh 25039 1726867447.67897: Set connection var ansible_timeout to 10 25039 1726867447.67903: Set connection var ansible_shell_type to sh 25039 1726867447.67905: Set connection var ansible_connection to ssh 25039 1726867447.67912: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867447.67917: Set connection var ansible_pipelining to False 25039 1726867447.67938: variable 'ansible_shell_executable' from source: unknown 25039 1726867447.67942: variable 'ansible_connection' from source: unknown 25039 1726867447.67944: variable 'ansible_module_compression' from source: unknown 25039 1726867447.67946: variable 'ansible_shell_type' from source: unknown 25039 1726867447.67949: variable 'ansible_shell_executable' from source: unknown 25039 1726867447.67951: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.67953: variable 'ansible_pipelining' from source: unknown 25039 1726867447.67957: variable 'ansible_timeout' from source: unknown 25039 1726867447.67961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.68053: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867447.68061: variable 'omit' from source: magic vars 25039 1726867447.68064: starting attempt loop 25039 1726867447.68067: running the handler 25039 1726867447.68103: handler run complete 25039 1726867447.68114: attempt loop complete, returning result 25039 1726867447.68118: _execute() done 25039 1726867447.68121: dumping result to json 25039 1726867447.68123: done dumping result, returning 25039 1726867447.68129: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0affcac9-a3a5-3ddc-7272-000000000180] 25039 1726867447.68131: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000180 25039 1726867447.68212: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000180 25039 1726867447.68215: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 25039 1726867447.68272: no more pending results, returning what we have 25039 1726867447.68275: results queue empty 25039 1726867447.68276: checking for any_errors_fatal 25039 1726867447.68283: done checking for any_errors_fatal 25039 1726867447.68284: checking for max_fail_percentage 25039 1726867447.68285: done checking for max_fail_percentage 25039 1726867447.68286: checking to see if all hosts have failed and the running result is not ok 25039 1726867447.68286: done checking to see if all hosts have failed 25039 1726867447.68287: getting the remaining hosts for this loop 25039 1726867447.68288: done getting the remaining hosts for this loop 25039 1726867447.68291: getting the next task for host managed_node1 25039 1726867447.68297: done getting next task for host managed_node1 25039 1726867447.68299: ^ task is: TASK: Install iproute 25039 1726867447.68301: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867447.68305: getting variables 25039 1726867447.68306: in VariableManager get_vars() 25039 1726867447.68337: Calling all_inventory to load vars for managed_node1 25039 1726867447.68339: Calling groups_inventory to load vars for managed_node1 25039 1726867447.68341: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867447.68347: Calling all_plugins_play to load vars for managed_node1 25039 1726867447.68349: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867447.68351: Calling groups_plugins_play to load vars for managed_node1 25039 1726867447.68454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867447.68569: done with get_vars() 25039 1726867447.68575: done getting variables 25039 1726867447.68614: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 17:24:07 -0400 (0:00:00.018) 0:00:05.211 ****** 25039 1726867447.68632: entering _queue_task() for managed_node1/package 25039 1726867447.68799: worker is 1 (out of 1 available) 25039 1726867447.68815: exiting _queue_task() for managed_node1/package 25039 1726867447.68828: done queuing things up, now waiting for results queue to drain 25039 1726867447.68829: waiting for pending results... 
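The "Install iproute" task uses the generic `package` action. Its shape can be read back out of the `module_args` in the module result further down (`"name": ["iproute"], "state": "present"`):

```yaml
# Reconstructed from the module_args recorded later in this log.
- name: Install iproute
  ansible.builtin.package:
    name: iproute
    state: present
```

On this Fedora/RHEL-family host the `package` action resolves to the `dnf` backend, which is why the subsequent log lines show `AnsiballZ_dnf.py` being built, transferred over the SSH connection, and executed; the module replies `"msg": "Nothing to do", "changed": false` because the package is already installed.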
25039 1726867447.68963: running TaskExecutor() for managed_node1/TASK: Install iproute 25039 1726867447.69020: in run() - task 0affcac9-a3a5-3ddc-7272-000000000159 25039 1726867447.69031: variable 'ansible_search_path' from source: unknown 25039 1726867447.69035: variable 'ansible_search_path' from source: unknown 25039 1726867447.69059: calling self._execute() 25039 1726867447.69116: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.69119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.69127: variable 'omit' from source: magic vars 25039 1726867447.69358: variable 'ansible_distribution_major_version' from source: facts 25039 1726867447.69367: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867447.69374: variable 'omit' from source: magic vars 25039 1726867447.69400: variable 'omit' from source: magic vars 25039 1726867447.69522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867447.70878: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867447.70920: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867447.70948: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867447.70975: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867447.71009: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867447.71070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867447.71093: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867447.71115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867447.71138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867447.71149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867447.71221: variable '__network_is_ostree' from source: set_fact 25039 1726867447.71225: variable 'omit' from source: magic vars 25039 1726867447.71245: variable 'omit' from source: magic vars 25039 1726867447.71265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867447.71287: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867447.71302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867447.71315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867447.71323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867447.71346: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867447.71349: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.71351: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 25039 1726867447.71418: Set connection var ansible_shell_executable to /bin/sh 25039 1726867447.71422: Set connection var ansible_timeout to 10 25039 1726867447.71428: Set connection var ansible_shell_type to sh 25039 1726867447.71431: Set connection var ansible_connection to ssh 25039 1726867447.71437: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867447.71443: Set connection var ansible_pipelining to False 25039 1726867447.71459: variable 'ansible_shell_executable' from source: unknown 25039 1726867447.71462: variable 'ansible_connection' from source: unknown 25039 1726867447.71464: variable 'ansible_module_compression' from source: unknown 25039 1726867447.71466: variable 'ansible_shell_type' from source: unknown 25039 1726867447.71469: variable 'ansible_shell_executable' from source: unknown 25039 1726867447.71471: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867447.71476: variable 'ansible_pipelining' from source: unknown 25039 1726867447.71480: variable 'ansible_timeout' from source: unknown 25039 1726867447.71483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867447.71548: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867447.71556: variable 'omit' from source: magic vars 25039 1726867447.71564: starting attempt loop 25039 1726867447.71567: running the handler 25039 1726867447.71572: variable 'ansible_facts' from source: unknown 25039 1726867447.71575: variable 'ansible_facts' from source: unknown 25039 1726867447.71603: _low_level_execute_command(): starting 25039 1726867447.71613: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 
1726867447.72101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867447.72105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.72110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867447.72112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.72158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867447.72161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867447.72163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867447.72217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867447.73834: stdout chunk (state=3): >>>/root <<< 25039 1726867447.73940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867447.73960: stderr chunk (state=3): >>><<< 25039 1726867447.73964: stdout chunk (state=3): >>><<< 25039 1726867447.73983: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867447.73992: _low_level_execute_command(): starting 25039 1726867447.73998: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484 `" && echo ansible-tmp-1726867447.7398183-25323-257939150837484="` echo /root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484 `" ) && sleep 0' 25039 1726867447.74412: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867447.74415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867447.74418: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.74420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867447.74422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.74472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867447.74483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867447.74527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867447.76387: stdout chunk (state=3): >>>ansible-tmp-1726867447.7398183-25323-257939150837484=/root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484 <<< 25039 1726867447.76494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867447.76521: stderr chunk (state=3): >>><<< 25039 1726867447.76524: stdout chunk (state=3): >>><<< 25039 1726867447.76537: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867447.7398183-25323-257939150837484=/root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867447.76558: variable 'ansible_module_compression' from source: unknown 25039 1726867447.76602: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 25039 1726867447.76606: ANSIBALLZ: Acquiring lock 25039 1726867447.76608: ANSIBALLZ: Lock acquired: 140682442827552 25039 1726867447.76610: ANSIBALLZ: Creating module 25039 1726867447.87201: ANSIBALLZ: Writing module into payload 25039 1726867447.87335: ANSIBALLZ: Writing module 25039 1726867447.87352: ANSIBALLZ: Renaming module 25039 1726867447.87363: ANSIBALLZ: Done creating module 25039 1726867447.87379: variable 'ansible_facts' from source: unknown 25039 1726867447.87447: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484/AnsiballZ_dnf.py 25039 1726867447.87542: Sending initial data 25039 1726867447.87546: Sent initial data (152 bytes) 25039 1726867447.87974: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867447.88009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867447.88012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867447.88014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.88017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867447.88019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867447.88020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.88073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867447.88076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867447.88084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867447.88132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867447.89781: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867447.89836: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867447.89881: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmps5dckhrk /root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484/AnsiballZ_dnf.py <<< 25039 1726867447.89884: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484/AnsiballZ_dnf.py" <<< 25039 1726867447.89938: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmps5dckhrk" to remote "/root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484/AnsiballZ_dnf.py" <<< 25039 1726867447.89941: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484/AnsiballZ_dnf.py" <<< 25039 1726867447.90610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867447.90647: stderr chunk (state=3): >>><<< 25039 1726867447.90650: stdout chunk (state=3): >>><<< 25039 1726867447.90689: done transferring module to remote 25039 1726867447.90697: _low_level_execute_command(): starting 25039 1726867447.90702: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484/ /root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484/AnsiballZ_dnf.py && sleep 0' 25039 1726867447.91273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867447.91279: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867447.91282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867447.91284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.91316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867447.91360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867447.93112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867447.93139: stderr chunk (state=3): >>><<< 25039 1726867447.93142: stdout chunk (state=3): >>><<< 25039 1726867447.93156: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867447.93159: _low_level_execute_command(): starting 25039 1726867447.93162: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484/AnsiballZ_dnf.py && sleep 0' 25039 1726867447.93551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867447.93589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867447.93592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867447.93594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 1726867447.93596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 25039 1726867447.93598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867447.93641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867447.93644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867447.93705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867448.34379: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 25039 1726867448.38472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867448.38504: stderr chunk (state=3): >>><<< 25039 1726867448.38507: stdout chunk (state=3): >>><<< 25039 1726867448.38527: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 25039 1726867448.38563: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867448.38569: _low_level_execute_command(): starting 25039 1726867448.38574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867447.7398183-25323-257939150837484/ > /dev/null 2>&1 && sleep 0' 25039 1726867448.39035: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867448.39038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.39042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867448.39044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.39098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867448.39101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867448.39152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867448.40954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867448.40985: stderr chunk (state=3): >>><<< 25039 1726867448.40988: stdout chunk (state=3): >>><<< 25039 1726867448.41000: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867448.41009: handler run complete 25039 1726867448.41122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867448.41246: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867448.41276: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867448.41301: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867448.41325: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867448.41379: variable '__install_status' from source: unknown 25039 1726867448.41393: Evaluated conditional (__install_status is success): True 25039 1726867448.41407: attempt loop complete, returning result 25039 1726867448.41413: _execute() done 25039 1726867448.41416: dumping result to json 25039 1726867448.41418: done dumping result, returning 25039 1726867448.41425: done running TaskExecutor() for managed_node1/TASK: Install iproute [0affcac9-a3a5-3ddc-7272-000000000159] 25039 1726867448.41429: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000159 25039 1726867448.41525: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000159 25039 1726867448.41530: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 25039 1726867448.41644: no more pending results, returning what we have 25039 1726867448.41648: results queue empty 25039 1726867448.41649: checking for any_errors_fatal 25039 1726867448.41654: done checking for any_errors_fatal 25039 1726867448.41654: checking for 
max_fail_percentage 25039 1726867448.41656: done checking for max_fail_percentage 25039 1726867448.41656: checking to see if all hosts have failed and the running result is not ok 25039 1726867448.41657: done checking to see if all hosts have failed 25039 1726867448.41658: getting the remaining hosts for this loop 25039 1726867448.41659: done getting the remaining hosts for this loop 25039 1726867448.41663: getting the next task for host managed_node1 25039 1726867448.41668: done getting next task for host managed_node1 25039 1726867448.41670: ^ task is: TASK: Create veth interface {{ interface }} 25039 1726867448.41673: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867448.41676: getting variables 25039 1726867448.41679: in VariableManager get_vars() 25039 1726867448.41716: Calling all_inventory to load vars for managed_node1 25039 1726867448.41718: Calling groups_inventory to load vars for managed_node1 25039 1726867448.41720: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867448.41729: Calling all_plugins_play to load vars for managed_node1 25039 1726867448.41731: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867448.41734: Calling groups_plugins_play to load vars for managed_node1 25039 1726867448.41913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867448.42032: done with get_vars() 25039 1726867448.42040: done getting variables 25039 1726867448.42083: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867448.42173: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 17:24:08 -0400 (0:00:00.735) 0:00:05.947 ****** 25039 1726867448.42214: entering _queue_task() for managed_node1/command 25039 1726867448.42419: worker is 1 (out of 1 available) 25039 1726867448.42431: exiting _queue_task() for managed_node1/command 25039 1726867448.42443: done queuing things up, now waiting for results queue to drain 25039 1726867448.42444: waiting for pending results... 
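The recurring `auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354'` lines above show Ansible reusing one multiplexed SSH master connection for every `_low_level_execute_command()` round-trip. A minimal `ssh_config` fragment reproducing that behavior outside Ansible might look like the following (illustrative values; Ansible actually passes the equivalent `-o ControlMaster=auto -o ControlPersist=60s -o ControlPath=...` options on the command line):

```
# ~/.ssh/config — sketch of the connection reuse behind the auto-mux lines.
# ControlPath %C hashes host/port/user into a short socket name, similar to
# the 'ac0999e354' socket seen in the log.
Host 10.31.12.57
    ControlMaster auto
    ControlPath ~/.ansible/cp/%C
    ControlPersist 60s
```

With a persisted master, each subsequent command (the `echo ~`, `mkdir`, `chmod`, and python invocations in this log) skips key exchange and authentication, which is why every session here reports only `mux_client_request_session: master session id: 2`.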
25039 1726867448.42593: running TaskExecutor() for managed_node1/TASK: Create veth interface veth0 25039 1726867448.42655: in run() - task 0affcac9-a3a5-3ddc-7272-00000000015a 25039 1726867448.42667: variable 'ansible_search_path' from source: unknown 25039 1726867448.42671: variable 'ansible_search_path' from source: unknown 25039 1726867448.42860: variable 'interface' from source: play vars 25039 1726867448.42921: variable 'interface' from source: play vars 25039 1726867448.42971: variable 'interface' from source: play vars 25039 1726867448.43076: Loaded config def from plugin (lookup/items) 25039 1726867448.43082: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 25039 1726867448.43099: variable 'omit' from source: magic vars 25039 1726867448.43179: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867448.43187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867448.43196: variable 'omit' from source: magic vars 25039 1726867448.43351: variable 'ansible_distribution_major_version' from source: facts 25039 1726867448.43358: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867448.43481: variable 'type' from source: play vars 25039 1726867448.43485: variable 'state' from source: include params 25039 1726867448.43490: variable 'interface' from source: play vars 25039 1726867448.43492: variable 'current_interfaces' from source: set_fact 25039 1726867448.43500: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 25039 1726867448.43506: variable 'omit' from source: magic vars 25039 1726867448.43531: variable 'omit' from source: magic vars 25039 1726867448.43564: variable 'item' from source: unknown 25039 1726867448.43614: variable 'item' from source: unknown 25039 1726867448.43625: variable 'omit' from source: magic vars 25039 1726867448.43651: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867448.43675: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867448.43692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867448.43705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867448.43714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867448.43736: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867448.43739: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867448.43742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867448.43817: Set connection var ansible_shell_executable to /bin/sh 25039 1726867448.43822: Set connection var ansible_timeout to 10 25039 1726867448.43828: Set connection var ansible_shell_type to sh 25039 1726867448.43830: Set connection var ansible_connection to ssh 25039 1726867448.43837: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867448.43841: Set connection var ansible_pipelining to False 25039 1726867448.43857: variable 'ansible_shell_executable' from source: unknown 25039 1726867448.43859: variable 'ansible_connection' from source: unknown 25039 1726867448.43862: variable 'ansible_module_compression' from source: unknown 25039 1726867448.43864: variable 'ansible_shell_type' from source: unknown 25039 1726867448.43866: variable 'ansible_shell_executable' from source: unknown 25039 1726867448.43869: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867448.43875: variable 'ansible_pipelining' from source: unknown 25039 1726867448.43880: variable 'ansible_timeout' from 
source: unknown 25039 1726867448.43885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867448.43974: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867448.43984: variable 'omit' from source: magic vars 25039 1726867448.43987: starting attempt loop 25039 1726867448.43991: running the handler 25039 1726867448.44005: _low_level_execute_command(): starting 25039 1726867448.44014: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867448.44513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867448.44517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.44520: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867448.44522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.44574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' <<< 25039 1726867448.44581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867448.44583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867448.44633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867448.46241: stdout chunk (state=3): >>>/root <<< 25039 1726867448.46334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867448.46371: stderr chunk (state=3): >>><<< 25039 1726867448.46374: stdout chunk (state=3): >>><<< 25039 1726867448.46399: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867448.46416: _low_level_execute_command(): starting 25039 1726867448.46423: _low_level_execute_command(): executing: /bin/sh 
-c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668 `" && echo ansible-tmp-1726867448.4639904-25341-221203993162668="` echo /root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668 `" ) && sleep 0' 25039 1726867448.46890: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867448.46895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867448.46898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.46902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867448.46904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.46953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867448.46957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867448.46962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867448.47010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867448.48881: stdout chunk (state=3): 
>>>ansible-tmp-1726867448.4639904-25341-221203993162668=/root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668 <<< 25039 1726867448.48988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867448.49012: stderr chunk (state=3): >>><<< 25039 1726867448.49015: stdout chunk (state=3): >>><<< 25039 1726867448.49030: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867448.4639904-25341-221203993162668=/root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867448.49054: variable 'ansible_module_compression' from source: unknown 25039 1726867448.49098: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867448.49128: variable 'ansible_facts' 
from source: unknown 25039 1726867448.49186: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668/AnsiballZ_command.py 25039 1726867448.49282: Sending initial data 25039 1726867448.49287: Sent initial data (156 bytes) 25039 1726867448.49714: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867448.49717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867448.49719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25039 1726867448.49721: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867448.49723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.49760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867448.49774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867448.49835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867448.51383: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25039 1726867448.51412: stderr chunk (state=3): >>>debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867448.51462: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867448.51526: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp19mmxfc6 /root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668/AnsiballZ_command.py <<< 25039 1726867448.51530: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668/AnsiballZ_command.py" <<< 25039 1726867448.51578: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp19mmxfc6" to remote "/root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668/AnsiballZ_command.py" <<< 25039 1726867448.52444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867448.52448: stdout chunk (state=3): >>><<< 25039 1726867448.52451: stderr chunk (state=3): >>><<< 25039 1726867448.52462: done transferring module to remote 25039 1726867448.52481: _low_level_execute_command(): starting 25039 1726867448.52493: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668/ /root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668/AnsiballZ_command.py && sleep 0' 25039 1726867448.53109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867448.53193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.53232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867448.53249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867448.53272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867448.53355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867448.55081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867448.55106: stderr chunk (state=3): >>><<< 25039 1726867448.55110: stdout chunk (state=3): >>><<< 25039 1726867448.55124: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867448.55128: _low_level_execute_command(): starting 25039 1726867448.55132: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668/AnsiballZ_command.py && sleep 0' 25039 1726867448.55531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867448.55534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.55537: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867448.55539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.55583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867448.55594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867448.55641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867448.71182: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 17:24:08.703590", "end": "2024-09-20 17:24:08.709013", "delta": "0:00:00.005423", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867448.73482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867448.73514: stderr chunk (state=3): >>><<< 25039 1726867448.73517: stdout chunk (state=3): >>><<< 25039 1726867448.73531: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 17:24:08.703590", "end": "2024-09-20 17:24:08.709013", "delta": "0:00:00.005423", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
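The module result in the stdout chunk above is a single JSON document; everything Ansible reports for the task (`rc`, `cmd`, timings) is parsed out of it on the controller. A minimal illustrative sketch of that parsing, using the exact payload from the trace (this is an explanatory example, not Ansible's own code):

```python
import json

# Verbatim (whitespace-normalized) module result from the stdout chunk above.
raw = (
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
    '"cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], '
    '"start": "2024-09-20 17:24:08.703590", "end": "2024-09-20 17:24:08.709013", '
    '"delta": "0:00:00.005423", "msg": ""}'
)

result = json.loads(raw)

# rc is the remote command's exit status; 0 means the veth pair was created.
print(result["rc"])             # 0
print(" ".join(result["cmd"]))  # ip link add veth0 type veth peer name peerveth0
```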
25039 1726867448.73562: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867448.73571: _low_level_execute_command(): starting 25039 1726867448.73574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867448.4639904-25341-221203993162668/ > /dev/null 2>&1 && sleep 0' 25039 1726867448.74024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867448.74029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867448.74032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.74035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867448.74037: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867448.74039: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.74094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867448.74097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867448.74098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867448.74188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867448.79075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867448.79100: stderr chunk (state=3): >>><<< 25039 1726867448.79104: stdout chunk (state=3): >>><<< 25039 1726867448.79119: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
25039 1726867448.79125: handler run complete
25039 1726867448.79143: Evaluated conditional (False): False
25039 1726867448.79150: attempt loop complete, returning result
25039 1726867448.79166: variable 'item' from source: unknown
25039 1726867448.79227: variable 'item' from source: unknown
ok: [managed_node1] => (item=ip link add veth0 type veth peer name peerveth0) => {
    "ansible_loop_var": "item",
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "add",
        "veth0",
        "type",
        "veth",
        "peer",
        "name",
        "peerveth0"
    ],
    "delta": "0:00:00.005423",
    "end": "2024-09-20 17:24:08.709013",
    "item": "ip link add veth0 type veth peer name peerveth0",
    "rc": 0,
    "start": "2024-09-20 17:24:08.703590"
}
25039 1726867448.79396: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867448.79399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867448.79401: variable 'omit' from source: magic vars
25039 1726867448.79471: variable 'ansible_distribution_major_version' from source: facts
25039 1726867448.79474: Evaluated conditional (ansible_distribution_major_version != '6'): True
25039 1726867448.79631: variable 'type' from source: play vars
25039 1726867448.79634: variable 'state' from source: include params
25039 1726867448.79637: variable 'interface' from source: play vars
25039 1726867448.79639: variable 'current_interfaces' from source: set_fact
25039 1726867448.79646: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True
25039 1726867448.79649: variable 'omit' from source: magic vars
25039 1726867448.79662: variable 'omit' from source: magic vars
25039 1726867448.79690: variable 'item' from source: unknown
25039 1726867448.79740: variable 'item' from source: unknown
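The trace above corresponds to a looped `ansible.legacy.command` task whose loop variable is `item` and which is gated on the conditional it prints. A hypothetical reconstruction of that task follows; the task name is an assumption, and only the loop items and `when` expression visible in the trace are included (the actual role source may differ):

```yaml
# Sketch only: reconstructed from the debug trace, not the actual role source.
- name: Create and bring up the veth pair   # task name is an assumption
  ansible.builtin.command: "{{ item }}"
  loop:
    - ip link add veth0 type veth peer name peerveth0
    - ip link set peerveth0 up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces
```

`command` always reports `changed: true` from the module side; the `ok:` (rather than `changed:`) status in the trace indicates the play layer overrides that, e.g. via `changed_when: false`.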
25039 1726867448.79748: variable 'omit' from source: magic vars 25039 1726867448.79764: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867448.79771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867448.79779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867448.79789: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867448.79792: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867448.79794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867448.79844: Set connection var ansible_shell_executable to /bin/sh 25039 1726867448.79855: Set connection var ansible_timeout to 10 25039 1726867448.79860: Set connection var ansible_shell_type to sh 25039 1726867448.79862: Set connection var ansible_connection to ssh 25039 1726867448.79868: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867448.79873: Set connection var ansible_pipelining to False 25039 1726867448.79890: variable 'ansible_shell_executable' from source: unknown 25039 1726867448.79893: variable 'ansible_connection' from source: unknown 25039 1726867448.79895: variable 'ansible_module_compression' from source: unknown 25039 1726867448.79897: variable 'ansible_shell_type' from source: unknown 25039 1726867448.79900: variable 'ansible_shell_executable' from source: unknown 25039 1726867448.79902: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867448.79906: variable 'ansible_pipelining' from source: unknown 25039 1726867448.79911: variable 'ansible_timeout' from source: unknown 25039 1726867448.79913: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node1' 25039 1726867448.79979: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867448.79988: variable 'omit' from source: magic vars 25039 1726867448.79991: starting attempt loop 25039 1726867448.79993: running the handler 25039 1726867448.80000: _low_level_execute_command(): starting 25039 1726867448.80003: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867448.80436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867448.80444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.80446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867448.80448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867448.80450: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.80495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867448.80498: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867448.80553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867448.82140: stdout chunk (state=3): >>>/root <<< 25039 1726867448.82240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867448.82263: stderr chunk (state=3): >>><<< 25039 1726867448.82266: stdout chunk (state=3): >>><<< 25039 1726867448.82281: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867448.82289: _low_level_execute_command(): starting 25039 1726867448.82292: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866 `" && echo ansible-tmp-1726867448.822803-25341-41288478035866="` echo /root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866 `" ) && sleep 0' 25039 1726867448.82683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867448.82688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867448.82702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.82751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867448.82754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867448.82809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867448.84657: stdout chunk (state=3): >>>ansible-tmp-1726867448.822803-25341-41288478035866=/root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866 <<< 25039 1726867448.84762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867448.84784: stderr chunk 
(state=3): >>><<< 25039 1726867448.84787: stdout chunk (state=3): >>><<< 25039 1726867448.84800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867448.822803-25341-41288478035866=/root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867448.84822: variable 'ansible_module_compression' from source: unknown 25039 1726867448.84849: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867448.84864: variable 'ansible_facts' from source: unknown 25039 1726867448.84908: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866/AnsiballZ_command.py 25039 1726867448.84994: Sending initial data 25039 1726867448.84997: Sent initial data (154 bytes) 25039 
1726867448.85425: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867448.85428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867448.85430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.85432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867448.85434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867448.85436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.85483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867448.85495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867448.85537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867448.87042: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 25039 1726867448.87046: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867448.87086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867448.87134: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp1_e44fy4 /root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866/AnsiballZ_command.py <<< 25039 1726867448.87137: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866/AnsiballZ_command.py" <<< 25039 1726867448.87179: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp1_e44fy4" to remote "/root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866/AnsiballZ_command.py" <<< 25039 1726867448.87186: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866/AnsiballZ_command.py" <<< 25039 1726867448.87952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867448.87955: stdout chunk (state=3): >>><<< 25039 1726867448.87957: stderr chunk (state=3): >>><<< 25039 1726867448.87959: done transferring module to remote 25039 1726867448.87961: _low_level_execute_command(): starting 25039 1726867448.87963: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866/ 
/root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866/AnsiballZ_command.py && sleep 0' 25039 1726867448.88535: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867448.88595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.88671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867448.88691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867448.88716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867448.88802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867448.90532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867448.90556: stderr chunk (state=3): >>><<< 25039 1726867448.90559: stdout chunk (state=3): >>><<< 25039 1726867448.90572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867448.90580: _low_level_execute_command(): starting 25039 1726867448.90583: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866/AnsiballZ_command.py && sleep 0' 25039 1726867448.91013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867448.91016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.91018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 
1726867448.91021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867448.91023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867448.91063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867448.91066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867448.91118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.06489: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 17:24:09.059382", "end": "2024-09-20 17:24:09.063001", "delta": "0:00:00.003619", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867449.07994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867449.08023: stderr chunk (state=3): >>><<< 25039 1726867449.08027: stdout chunk (state=3): >>><<< 25039 1726867449.08040: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 17:24:09.059382", "end": "2024-09-20 17:24:09.063001", "delta": "0:00:00.003619", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867449.08064: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867449.08069: _low_level_execute_command(): starting 25039 1726867449.08074: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867448.822803-25341-41288478035866/ > /dev/null 2>&1 && sleep 0' 25039 1726867449.08465: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.08469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867449.08497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.08500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.08503: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.08553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.08557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.08610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.10419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867449.10437: stderr chunk (state=3): >>><<< 25039 1726867449.10440: stdout chunk (state=3): >>><<< 25039 1726867449.10451: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867449.10457: handler run complete 25039 1726867449.10471: Evaluated conditional (False): False 25039 1726867449.10484: attempt loop complete, returning result 25039 1726867449.10499: variable 'item' from source: unknown 25039 1726867449.10561: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003619", "end": "2024-09-20 17:24:09.063001", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-20 17:24:09.059382" } 25039 1726867449.10682: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.10685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.10689: variable 'omit' from source: magic vars 25039 1726867449.10781: variable 'ansible_distribution_major_version' from source: facts 25039 1726867449.10784: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867449.10912: variable 'type' from source: play vars 25039 1726867449.10919: variable 'state' from source: include params 25039 1726867449.10922: variable 'interface' from source: play vars 25039 1726867449.10925: variable 'current_interfaces' from source: set_fact 25039 1726867449.10927: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 25039 1726867449.10929: variable 'omit' from source: magic vars 25039 1726867449.10940: variable 'omit' from source: magic vars 25039 1726867449.10966: variable 'item' from source: unknown 25039 1726867449.11010: variable 'item' from source: unknown 25039 1726867449.11027: variable 'omit' from source: magic vars 25039 1726867449.11042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 25039 1726867449.11048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867449.11055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867449.11064: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867449.11067: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.11070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.11120: Set connection var ansible_shell_executable to /bin/sh 25039 1726867449.11123: Set connection var ansible_timeout to 10 25039 1726867449.11135: Set connection var ansible_shell_type to sh 25039 1726867449.11137: Set connection var ansible_connection to ssh 25039 1726867449.11140: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867449.11142: Set connection var ansible_pipelining to False 25039 1726867449.11158: variable 'ansible_shell_executable' from source: unknown 25039 1726867449.11160: variable 'ansible_connection' from source: unknown 25039 1726867449.11163: variable 'ansible_module_compression' from source: unknown 25039 1726867449.11165: variable 'ansible_shell_type' from source: unknown 25039 1726867449.11167: variable 'ansible_shell_executable' from source: unknown 25039 1726867449.11169: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.11174: variable 'ansible_pipelining' from source: unknown 25039 1726867449.11176: variable 'ansible_timeout' from source: unknown 25039 1726867449.11182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.11245: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867449.11253: variable 'omit' from source: magic vars 25039 1726867449.11256: starting attempt loop 25039 1726867449.11258: running the handler 25039 1726867449.11263: _low_level_execute_command(): starting 25039 1726867449.11268: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867449.11637: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.11670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867449.11673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867449.11676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.11685: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.11688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867449.11690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.11729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.11733: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.11790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.13342: stdout chunk (state=3): >>>/root <<< 25039 1726867449.13439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867449.13459: stderr chunk (state=3): >>><<< 25039 1726867449.13462: stdout chunk (state=3): >>><<< 25039 1726867449.13473: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867449.13481: _low_level_execute_command(): starting 25039 1726867449.13486: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224 `" && echo 
ansible-tmp-1726867449.1347218-25341-263904874246224="` echo /root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224 `" ) && sleep 0' 25039 1726867449.13879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867449.13883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867449.13885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25039 1726867449.13887: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.13889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.13935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.13945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.13989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.15828: stdout chunk (state=3): >>>ansible-tmp-1726867449.1347218-25341-263904874246224=/root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224 <<< 25039 1726867449.15936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 
1726867449.15955: stderr chunk (state=3): >>><<< 25039 1726867449.15958: stdout chunk (state=3): >>><<< 25039 1726867449.15969: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867449.1347218-25341-263904874246224=/root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867449.15989: variable 'ansible_module_compression' from source: unknown 25039 1726867449.16017: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867449.16030: variable 'ansible_facts' from source: unknown 25039 1726867449.16074: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224/AnsiballZ_command.py 25039 1726867449.16154: Sending initial data 25039 
1726867449.16158: Sent initial data (156 bytes) 25039 1726867449.16548: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867449.16581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867449.16584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867449.16587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.16589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.16591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.16641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.16648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.16693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.18211: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25039 1726867449.18215: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867449.18266: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867449.18310: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpftshfqzc /root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224/AnsiballZ_command.py <<< 25039 1726867449.18316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224/AnsiballZ_command.py" <<< 25039 1726867449.18354: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpftshfqzc" to remote "/root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224/AnsiballZ_command.py" <<< 25039 1726867449.18363: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224/AnsiballZ_command.py" <<< 25039 1726867449.18898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867449.18932: stderr chunk (state=3): >>><<< 25039 1726867449.18936: stdout chunk (state=3): >>><<< 25039 1726867449.18948: done transferring module to remote 25039 1726867449.18954: _low_level_execute_command(): starting 25039 1726867449.18957: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224/ /root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224/AnsiballZ_command.py && sleep 0' 25039 1726867449.19337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.19340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.19344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.19347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.19395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.19399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.19446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.21151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867449.21171: stderr chunk (state=3): >>><<< 25039 1726867449.21175: stdout chunk (state=3): >>><<< 25039 1726867449.21194: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867449.21197: _low_level_execute_command(): starting 25039 1726867449.21200: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224/AnsiballZ_command.py && sleep 0' 25039 1726867449.21588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.21591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.21597: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867449.21599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867449.21601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.21644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867449.21648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.21702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.37168: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 17:24:09.365544", "end": "2024-09-20 17:24:09.369309", "delta": "0:00:00.003765", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867449.38784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867449.38788: stdout chunk (state=3): >>><<< 25039 1726867449.38790: stderr chunk (state=3): >>><<< 25039 1726867449.38793: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 17:24:09.365544", "end": "2024-09-20 17:24:09.369309", "delta": "0:00:00.003765", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
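The JSON blob in the stdout above is the module result that Ansible parses back into the task outcome (`ok: [managed_node1] => …`). As a minimal sketch of how those fields relate, the snippet below loads a trimmed copy of that result and cross-checks the reported `delta` against `end - start`; the variable names here are my own, not Ansible internals.

```python
import json
from datetime import datetime

# Trimmed copy of the module result shown in the log above
# (the full payload also carries "invocation" with the module_args).
result = json.loads('''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
 "cmd": ["ip", "link", "set", "veth0", "up"],
 "start": "2024-09-20 17:24:09.365544", "end": "2024-09-20 17:24:09.369309",
 "delta": "0:00:00.003765"}''')

# rc == 0 is what lets the task report "ok" for this loop item.
assert result["rc"] == 0

# "delta" is simply end - start; recompute it to verify the timestamps agree.
fmt = "%Y-%m-%d %H:%M:%S.%f"
elapsed = datetime.strptime(result["end"], fmt) - datetime.strptime(result["start"], fmt)
print(str(elapsed))  # 0:00:00.003765, matching result["delta"]
```

Note that the final task status shows `"changed": false` even though the module returned `"changed": true`: the `command` action's result is post-processed by the task executor before display.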
25039 1726867449.38796: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867449.38799: _low_level_execute_command(): starting 25039 1726867449.38801: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867449.1347218-25341-263904874246224/ > /dev/null 2>&1 && sleep 0' 25039 1726867449.39388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867449.39396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867449.39416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.39430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867449.39442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867449.39449: stderr chunk (state=3): >>>debug2: match not found <<< 25039 1726867449.39459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.39473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25039 1726867449.39529: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.39569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867449.39589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.39634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.39683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.41582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867449.41585: stderr chunk (state=3): >>><<< 25039 1726867449.41587: stdout chunk (state=3): >>><<< 25039 1726867449.41590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867449.41592: handler run complete 25039 1726867449.41594: Evaluated conditional (False): False 25039 1726867449.41596: attempt loop complete, returning result 25039 1726867449.41602: variable 'item' from source: unknown 25039 1726867449.41696: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.003765", "end": "2024-09-20 17:24:09.369309", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-20 17:24:09.365544" } 25039 1726867449.41819: dumping result to json 25039 1726867449.41822: done dumping result, returning 25039 1726867449.41825: done running TaskExecutor() for managed_node1/TASK: Create veth interface veth0 [0affcac9-a3a5-3ddc-7272-00000000015a] 25039 1726867449.41828: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000015a 25039 1726867449.42006: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000015a 25039 1726867449.42009: WORKER PROCESS EXITING 25039 1726867449.42117: no more pending results, returning what we have 25039 1726867449.42121: results queue empty 25039 1726867449.42122: checking for any_errors_fatal 25039 1726867449.42129: done checking for any_errors_fatal 25039 1726867449.42130: checking for max_fail_percentage 25039 1726867449.42131: done checking for max_fail_percentage 25039 1726867449.42132: checking to see if all hosts have failed and the running result is not ok 25039 1726867449.42133: done checking to see if all hosts have failed 25039 1726867449.42133: getting the remaining hosts for this 
loop 25039 1726867449.42135: done getting the remaining hosts for this loop 25039 1726867449.42138: getting the next task for host managed_node1 25039 1726867449.42144: done getting next task for host managed_node1 25039 1726867449.42147: ^ task is: TASK: Set up veth as managed by NetworkManager 25039 1726867449.42150: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867449.42154: getting variables 25039 1726867449.42155: in VariableManager get_vars() 25039 1726867449.42332: Calling all_inventory to load vars for managed_node1 25039 1726867449.42335: Calling groups_inventory to load vars for managed_node1 25039 1726867449.42338: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867449.42349: Calling all_plugins_play to load vars for managed_node1 25039 1726867449.42352: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867449.42355: Calling groups_plugins_play to load vars for managed_node1 25039 1726867449.42729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867449.42851: done with get_vars() 25039 1726867449.42861: done getting variables 25039 1726867449.42903: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 17:24:09 -0400 (0:00:01.007) 0:00:06.954 ****** 25039 1726867449.42927: entering _queue_task() for managed_node1/command 25039 1726867449.43128: worker is 1 (out of 1 available) 25039 1726867449.43142: exiting _queue_task() for managed_node1/command 25039 1726867449.43154: done queuing things up, now waiting for results queue to drain 25039 1726867449.43156: waiting for pending results... 25039 1726867449.43305: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 25039 1726867449.43362: in run() - task 0affcac9-a3a5-3ddc-7272-00000000015b 25039 1726867449.43372: variable 'ansible_search_path' from source: unknown 25039 1726867449.43379: variable 'ansible_search_path' from source: unknown 25039 1726867449.43406: calling self._execute() 25039 1726867449.43466: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.43469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.43479: variable 'omit' from source: magic vars 25039 1726867449.43732: variable 'ansible_distribution_major_version' from source: facts 25039 1726867449.43741: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867449.43845: variable 'type' from source: play vars 25039 1726867449.43848: variable 'state' from source: include params 25039 1726867449.43855: Evaluated conditional (type == 'veth' and state == 'present'): True 25039 1726867449.43862: variable 'omit' from source: magic vars 25039 1726867449.43888: variable 'omit' from source: magic vars 
25039 1726867449.43955: variable 'interface' from source: play vars 25039 1726867449.43971: variable 'omit' from source: magic vars 25039 1726867449.44001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867449.44027: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867449.44044: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867449.44058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867449.44068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867449.44094: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867449.44097: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.44100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.44168: Set connection var ansible_shell_executable to /bin/sh 25039 1726867449.44173: Set connection var ansible_timeout to 10 25039 1726867449.44180: Set connection var ansible_shell_type to sh 25039 1726867449.44182: Set connection var ansible_connection to ssh 25039 1726867449.44191: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867449.44196: Set connection var ansible_pipelining to False 25039 1726867449.44215: variable 'ansible_shell_executable' from source: unknown 25039 1726867449.44218: variable 'ansible_connection' from source: unknown 25039 1726867449.44220: variable 'ansible_module_compression' from source: unknown 25039 1726867449.44222: variable 'ansible_shell_type' from source: unknown 25039 1726867449.44225: variable 'ansible_shell_executable' from source: unknown 25039 1726867449.44229: variable 'ansible_host' from source: 
host vars for 'managed_node1' 25039 1726867449.44232: variable 'ansible_pipelining' from source: unknown 25039 1726867449.44235: variable 'ansible_timeout' from source: unknown 25039 1726867449.44239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.44352: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867449.44362: variable 'omit' from source: magic vars 25039 1726867449.44482: starting attempt loop 25039 1726867449.44485: running the handler 25039 1726867449.44487: _low_level_execute_command(): starting 25039 1726867449.44490: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867449.45096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867449.45114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867449.45157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.45174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867449.45256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867449.45275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.45301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.45401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.47060: stdout chunk (state=3): >>>/root <<< 25039 1726867449.47151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867449.47180: stderr chunk (state=3): >>><<< 25039 1726867449.47184: stdout chunk (state=3): >>><<< 25039 1726867449.47203: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 25039 1726867449.47214: _low_level_execute_command(): starting 25039 1726867449.47220: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634 `" && echo ansible-tmp-1726867449.4720254-25381-240317079658634="` echo /root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634 `" ) && sleep 0' 25039 1726867449.47835: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.47869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.47922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.49757: stdout chunk (state=3): >>>ansible-tmp-1726867449.4720254-25381-240317079658634=/root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634 <<< 25039 1726867449.49865: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 25039 1726867449.49891: stderr chunk (state=3): >>><<< 25039 1726867449.49894: stdout chunk (state=3): >>><<< 25039 1726867449.49907: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867449.4720254-25381-240317079658634=/root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867449.49934: variable 'ansible_module_compression' from source: unknown 25039 1726867449.49970: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867449.50001: variable 'ansible_facts' from source: unknown 25039 1726867449.50097: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634/AnsiballZ_command.py 25039 1726867449.50225: Sending initial 
data 25039 1726867449.50228: Sent initial data (156 bytes) 25039 1726867449.50807: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.50814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867449.50817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.50819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.50821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.50837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.50855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.50935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.52443: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports 
extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25039 1726867449.52450: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867449.52512: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867449.52615: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp_1lka7hp /root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634/AnsiballZ_command.py <<< 25039 1726867449.52621: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634/AnsiballZ_command.py" <<< 25039 1726867449.52681: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp_1lka7hp" to remote "/root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634/AnsiballZ_command.py" <<< 25039 1726867449.53239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867449.53270: stderr chunk (state=3): >>><<< 25039 1726867449.53274: stdout chunk (state=3): >>><<< 25039 1726867449.53315: done transferring module to remote 25039 1726867449.53324: _low_level_execute_command(): starting 25039 1726867449.53328: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634/ /root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634/AnsiballZ_command.py && 
sleep 0' 25039 1726867449.53713: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.53717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.53734: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.53783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867449.53788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.53839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.55780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867449.55784: stdout chunk (state=3): >>><<< 25039 1726867449.55789: stderr chunk (state=3): >>><<< 25039 1726867449.56064: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867449.56068: _low_level_execute_command(): starting 25039 1726867449.56070: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634/AnsiballZ_command.py && sleep 0' 25039 1726867449.57064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867449.57076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25039 1726867449.57092: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867449.57162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.57235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.57383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.74149: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 17:24:09.722326", "end": "2024-09-20 17:24:09.739226", "delta": "0:00:00.016900", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 25039 1726867449.74204: stdout chunk (state=3): >>> <<< 25039 1726867449.75754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867449.75765: stdout chunk (state=3): >>><<< 25039 1726867449.75786: stderr chunk (state=3): >>><<< 25039 1726867449.75811: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 17:24:09.722326", "end": "2024-09-20 17:24:09.739226", "delta": "0:00:00.016900", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867449.75860: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867449.75878: _low_level_execute_command(): starting 25039 1726867449.75956: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867449.4720254-25381-240317079658634/ > /dev/null 2>&1 && sleep 0' 25039 1726867449.76522: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867449.76535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867449.76550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.76566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867449.76590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867449.76602: stderr chunk (state=3): >>>debug2: match not found <<< 25039 1726867449.76639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867449.76655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867449.76742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.76768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.76853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.78679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867449.78692: stdout chunk (state=3): >>><<< 25039 1726867449.78702: stderr chunk (state=3): >>><<< 25039 1726867449.78724: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867449.78734: handler run complete 25039 1726867449.78759: Evaluated conditional (False): False 25039 1726867449.78776: attempt loop complete, returning result 25039 1726867449.78786: _execute() done 25039 1726867449.78796: dumping result to json 25039 1726867449.78805: done dumping result, returning 25039 1726867449.78819: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [0affcac9-a3a5-3ddc-7272-00000000015b] 25039 1726867449.78829: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000015b ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.016900", "end": "2024-09-20 17:24:09.739226", "rc": 0, "start": "2024-09-20 17:24:09.722326" } 25039 1726867449.79119: no more pending results, returning what we have 25039 1726867449.79122: results queue empty 25039 1726867449.79123: checking for any_errors_fatal 25039 1726867449.79134: done checking for any_errors_fatal 25039 1726867449.79135: checking for max_fail_percentage 25039 1726867449.79136: done checking for max_fail_percentage 25039 1726867449.79137: checking to see if all hosts have failed and the running result is not ok 25039 1726867449.79139: done checking to see if all hosts have failed 25039 1726867449.79139: getting the remaining hosts for this loop 25039 1726867449.79141: done getting the remaining hosts for this loop 25039 1726867449.79144: getting the next task for host managed_node1 25039 1726867449.79151: done getting next task for host managed_node1 25039 1726867449.79154: ^ task is: TASK: Delete veth interface {{ interface }} 25039 1726867449.79157: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867449.79161: getting variables 25039 1726867449.79163: in VariableManager get_vars() 25039 1726867449.79403: Calling all_inventory to load vars for managed_node1 25039 1726867449.79405: Calling groups_inventory to load vars for managed_node1 25039 1726867449.79410: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867449.79423: Calling all_plugins_play to load vars for managed_node1 25039 1726867449.79426: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867449.79429: Calling groups_plugins_play to load vars for managed_node1 25039 1726867449.79595: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000015b 25039 1726867449.79599: WORKER PROCESS EXITING 25039 1726867449.79623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867449.79838: done with get_vars() 25039 1726867449.79853: done getting variables 25039 1726867449.79911: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867449.80029: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 17:24:09 -0400 (0:00:00.371) 0:00:07.326 ****** 25039 1726867449.80058: entering _queue_task() for managed_node1/command 25039 1726867449.80495: worker is 1 (out of 1 available) 25039 1726867449.80503: exiting _queue_task() for managed_node1/command 25039 1726867449.80515: done queuing things up, now waiting for results queue to drain 25039 1726867449.80516: waiting for pending results... 25039 1726867449.80573: running TaskExecutor() for managed_node1/TASK: Delete veth interface veth0 25039 1726867449.80673: in run() - task 0affcac9-a3a5-3ddc-7272-00000000015c 25039 1726867449.80696: variable 'ansible_search_path' from source: unknown 25039 1726867449.80705: variable 'ansible_search_path' from source: unknown 25039 1726867449.80754: calling self._execute() 25039 1726867449.80844: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.80963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.80966: variable 'omit' from source: magic vars 25039 1726867449.81243: variable 'ansible_distribution_major_version' from source: facts 25039 1726867449.81263: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867449.81491: variable 'type' from source: play vars 25039 1726867449.81514: variable 'state' from source: include params 25039 1726867449.81523: variable 'interface' from source: play vars 25039 1726867449.81532: variable 'current_interfaces' from source: set_fact 25039 1726867449.81547: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 25039 1726867449.81554: when evaluation is False, skipping this task 25039 1726867449.81561: _execute() done 25039 1726867449.81568: dumping result to json 25039 1726867449.81576: done dumping result, returning 25039 1726867449.81589: 
done running TaskExecutor() for managed_node1/TASK: Delete veth interface veth0 [0affcac9-a3a5-3ddc-7272-00000000015c] 25039 1726867449.81600: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000015c skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25039 1726867449.81884: no more pending results, returning what we have 25039 1726867449.81887: results queue empty 25039 1726867449.81888: checking for any_errors_fatal 25039 1726867449.81899: done checking for any_errors_fatal 25039 1726867449.81900: checking for max_fail_percentage 25039 1726867449.81902: done checking for max_fail_percentage 25039 1726867449.81903: checking to see if all hosts have failed and the running result is not ok 25039 1726867449.81904: done checking to see if all hosts have failed 25039 1726867449.81905: getting the remaining hosts for this loop 25039 1726867449.81906: done getting the remaining hosts for this loop 25039 1726867449.81912: getting the next task for host managed_node1 25039 1726867449.81919: done getting next task for host managed_node1 25039 1726867449.81921: ^ task is: TASK: Create dummy interface {{ interface }} 25039 1726867449.81924: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867449.81928: getting variables 25039 1726867449.81930: in VariableManager get_vars() 25039 1726867449.81971: Calling all_inventory to load vars for managed_node1 25039 1726867449.81974: Calling groups_inventory to load vars for managed_node1 25039 1726867449.81978: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867449.81992: Calling all_plugins_play to load vars for managed_node1 25039 1726867449.81995: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867449.81998: Calling groups_plugins_play to load vars for managed_node1 25039 1726867449.82249: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000015c 25039 1726867449.82252: WORKER PROCESS EXITING 25039 1726867449.82275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867449.82523: done with get_vars() 25039 1726867449.82533: done getting variables 25039 1726867449.82595: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867449.82709: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 17:24:09 -0400 (0:00:00.026) 0:00:07.353 ****** 25039 1726867449.82741: entering _queue_task() for managed_node1/command 25039 1726867449.83074: worker is 1 (out of 1 available) 25039 1726867449.83086: exiting _queue_task() for managed_node1/command 25039 1726867449.83096: done queuing things up, now waiting for results queue to drain 25039 1726867449.83098: waiting for pending results... 
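
The skip records above all follow the same pattern: `_execute()` evaluates the task's `when:` expression against `type`, `state`, and `current_interfaces`, and returns a skip result when it is False. A hypothetical reconstruction of the task shape at manage_test_interface.yml:43 follows; only the task name and the `when:` expression are confirmed by the log, and the module arguments are an assumption for illustration:

```yaml
# Hypothetical sketch only. The log above confirms the task name
# "Delete veth interface {{ interface }}" and the conditional
# "type == 'veth' and state == 'absent' and interface in current_interfaces";
# the command body is assumed, not taken from the actual test file.
- name: Delete veth interface {{ interface }}
  ansible.builtin.command: ip link delete {{ interface }} type veth  # assumed
  when:
    - type == 'veth'
    - state == 'absent'
    - interface in current_interfaces
```

Because `current_interfaces` (from `set_fact`) does not satisfy the expression here, the worker reports `skip_reason: "Conditional result was False"` without ever invoking the command module remotely.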
25039 1726867449.83239: running TaskExecutor() for managed_node1/TASK: Create dummy interface veth0 25039 1726867449.83346: in run() - task 0affcac9-a3a5-3ddc-7272-00000000015d 25039 1726867449.83364: variable 'ansible_search_path' from source: unknown 25039 1726867449.83372: variable 'ansible_search_path' from source: unknown 25039 1726867449.83416: calling self._execute() 25039 1726867449.83502: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.83518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.83531: variable 'omit' from source: magic vars 25039 1726867449.83888: variable 'ansible_distribution_major_version' from source: facts 25039 1726867449.83904: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867449.84144: variable 'type' from source: play vars 25039 1726867449.84147: variable 'state' from source: include params 25039 1726867449.84150: variable 'interface' from source: play vars 25039 1726867449.84152: variable 'current_interfaces' from source: set_fact 25039 1726867449.84154: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 25039 1726867449.84162: when evaluation is False, skipping this task 25039 1726867449.84191: _execute() done 25039 1726867449.84194: dumping result to json 25039 1726867449.84196: done dumping result, returning 25039 1726867449.84199: done running TaskExecutor() for managed_node1/TASK: Create dummy interface veth0 [0affcac9-a3a5-3ddc-7272-00000000015d] 25039 1726867449.84201: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000015d 25039 1726867449.84403: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000015d 25039 1726867449.84409: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result 
was False" } 25039 1726867449.84450: no more pending results, returning what we have 25039 1726867449.84454: results queue empty 25039 1726867449.84455: checking for any_errors_fatal 25039 1726867449.84459: done checking for any_errors_fatal 25039 1726867449.84460: checking for max_fail_percentage 25039 1726867449.84461: done checking for max_fail_percentage 25039 1726867449.84462: checking to see if all hosts have failed and the running result is not ok 25039 1726867449.84467: done checking to see if all hosts have failed 25039 1726867449.84468: getting the remaining hosts for this loop 25039 1726867449.84470: done getting the remaining hosts for this loop 25039 1726867449.84474: getting the next task for host managed_node1 25039 1726867449.84480: done getting next task for host managed_node1 25039 1726867449.84482: ^ task is: TASK: Delete dummy interface {{ interface }} 25039 1726867449.84485: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867449.84488: getting variables 25039 1726867449.84490: in VariableManager get_vars() 25039 1726867449.84527: Calling all_inventory to load vars for managed_node1 25039 1726867449.84529: Calling groups_inventory to load vars for managed_node1 25039 1726867449.84532: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867449.84543: Calling all_plugins_play to load vars for managed_node1 25039 1726867449.84546: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867449.84549: Calling groups_plugins_play to load vars for managed_node1 25039 1726867449.84811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867449.85022: done with get_vars() 25039 1726867449.85031: done getting variables 25039 1726867449.85094: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867449.85204: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 17:24:09 -0400 (0:00:00.024) 0:00:07.377 ****** 25039 1726867449.85239: entering _queue_task() for managed_node1/command 25039 1726867449.85563: worker is 1 (out of 1 available) 25039 1726867449.85573: exiting _queue_task() for managed_node1/command 25039 1726867449.85584: done queuing things up, now waiting for results queue to drain 25039 1726867449.85586: waiting for pending results... 
25039 1726867449.85784: running TaskExecutor() for managed_node1/TASK: Delete dummy interface veth0 25039 1726867449.85830: in run() - task 0affcac9-a3a5-3ddc-7272-00000000015e 25039 1726867449.85849: variable 'ansible_search_path' from source: unknown 25039 1726867449.85858: variable 'ansible_search_path' from source: unknown 25039 1726867449.85899: calling self._execute() 25039 1726867449.85984: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.86030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.86034: variable 'omit' from source: magic vars 25039 1726867449.86393: variable 'ansible_distribution_major_version' from source: facts 25039 1726867449.86413: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867449.86641: variable 'type' from source: play vars 25039 1726867449.86654: variable 'state' from source: include params 25039 1726867449.86684: variable 'interface' from source: play vars 25039 1726867449.86687: variable 'current_interfaces' from source: set_fact 25039 1726867449.86691: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 25039 1726867449.86694: when evaluation is False, skipping this task 25039 1726867449.86699: _execute() done 25039 1726867449.86751: dumping result to json 25039 1726867449.86754: done dumping result, returning 25039 1726867449.86757: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface veth0 [0affcac9-a3a5-3ddc-7272-00000000015e] 25039 1726867449.86759: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000015e skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25039 1726867449.87017: no more pending results, returning what we have 25039 1726867449.87020: results queue empty 25039 
1726867449.87021: checking for any_errors_fatal 25039 1726867449.87026: done checking for any_errors_fatal 25039 1726867449.87027: checking for max_fail_percentage 25039 1726867449.87029: done checking for max_fail_percentage 25039 1726867449.87029: checking to see if all hosts have failed and the running result is not ok 25039 1726867449.87030: done checking to see if all hosts have failed 25039 1726867449.87031: getting the remaining hosts for this loop 25039 1726867449.87033: done getting the remaining hosts for this loop 25039 1726867449.87036: getting the next task for host managed_node1 25039 1726867449.87042: done getting next task for host managed_node1 25039 1726867449.87044: ^ task is: TASK: Create tap interface {{ interface }} 25039 1726867449.87047: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867449.87051: getting variables 25039 1726867449.87053: in VariableManager get_vars() 25039 1726867449.87096: Calling all_inventory to load vars for managed_node1 25039 1726867449.87098: Calling groups_inventory to load vars for managed_node1 25039 1726867449.87101: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867449.87116: Calling all_plugins_play to load vars for managed_node1 25039 1726867449.87119: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867449.87123: Calling groups_plugins_play to load vars for managed_node1 25039 1726867449.87426: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000015e 25039 1726867449.87429: WORKER PROCESS EXITING 25039 1726867449.87451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867449.87653: done with get_vars() 25039 1726867449.87663: done getting variables 25039 1726867449.87721: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867449.87830: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 17:24:09 -0400 (0:00:00.026) 0:00:07.404 ****** 25039 1726867449.87863: entering _queue_task() for managed_node1/command 25039 1726867449.88125: worker is 1 (out of 1 available) 25039 1726867449.88138: exiting _queue_task() for managed_node1/command 25039 1726867449.88150: done queuing things up, now waiting for results queue to drain 25039 1726867449.88151: waiting for pending results... 
25039 1726867449.88382: running TaskExecutor() for managed_node1/TASK: Create tap interface veth0 25039 1726867449.88481: in run() - task 0affcac9-a3a5-3ddc-7272-00000000015f 25039 1726867449.88504: variable 'ansible_search_path' from source: unknown 25039 1726867449.88513: variable 'ansible_search_path' from source: unknown 25039 1726867449.88546: calling self._execute() 25039 1726867449.88631: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.88641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.88651: variable 'omit' from source: magic vars 25039 1726867449.88973: variable 'ansible_distribution_major_version' from source: facts 25039 1726867449.88990: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867449.89199: variable 'type' from source: play vars 25039 1726867449.89213: variable 'state' from source: include params 25039 1726867449.89262: variable 'interface' from source: play vars 25039 1726867449.89265: variable 'current_interfaces' from source: set_fact 25039 1726867449.89268: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 25039 1726867449.89271: when evaluation is False, skipping this task 25039 1726867449.89273: _execute() done 25039 1726867449.89275: dumping result to json 25039 1726867449.89280: done dumping result, returning 25039 1726867449.89283: done running TaskExecutor() for managed_node1/TASK: Create tap interface veth0 [0affcac9-a3a5-3ddc-7272-00000000015f] 25039 1726867449.89285: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000015f 25039 1726867449.89491: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000015f 25039 1726867449.89494: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 25039 1726867449.89539: no more pending results, returning what we have 25039 1726867449.89542: results queue empty 25039 1726867449.89543: checking for any_errors_fatal 25039 1726867449.89549: done checking for any_errors_fatal 25039 1726867449.89550: checking for max_fail_percentage 25039 1726867449.89551: done checking for max_fail_percentage 25039 1726867449.89552: checking to see if all hosts have failed and the running result is not ok 25039 1726867449.89553: done checking to see if all hosts have failed 25039 1726867449.89554: getting the remaining hosts for this loop 25039 1726867449.89555: done getting the remaining hosts for this loop 25039 1726867449.89558: getting the next task for host managed_node1 25039 1726867449.89563: done getting next task for host managed_node1 25039 1726867449.89565: ^ task is: TASK: Delete tap interface {{ interface }} 25039 1726867449.89568: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867449.89572: getting variables 25039 1726867449.89573: in VariableManager get_vars() 25039 1726867449.89618: Calling all_inventory to load vars for managed_node1 25039 1726867449.89622: Calling groups_inventory to load vars for managed_node1 25039 1726867449.89624: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867449.89635: Calling all_plugins_play to load vars for managed_node1 25039 1726867449.89639: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867449.89642: Calling groups_plugins_play to load vars for managed_node1 25039 1726867449.89898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867449.90099: done with get_vars() 25039 1726867449.90111: done getting variables 25039 1726867449.90165: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867449.90268: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 17:24:09 -0400 (0:00:00.024) 0:00:07.428 ****** 25039 1726867449.90297: entering _queue_task() for managed_node1/command 25039 1726867449.90581: worker is 1 (out of 1 available) 25039 1726867449.90592: exiting _queue_task() for managed_node1/command 25039 1726867449.90601: done queuing things up, now waiting for results queue to drain 25039 1726867449.90603: waiting for pending results... 
25039 1726867449.90863: running TaskExecutor() for managed_node1/TASK: Delete tap interface veth0 25039 1726867449.90887: in run() - task 0affcac9-a3a5-3ddc-7272-000000000160 25039 1726867449.90905: variable 'ansible_search_path' from source: unknown 25039 1726867449.90917: variable 'ansible_search_path' from source: unknown 25039 1726867449.90959: calling self._execute() 25039 1726867449.91043: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.91067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.91180: variable 'omit' from source: magic vars 25039 1726867449.91416: variable 'ansible_distribution_major_version' from source: facts 25039 1726867449.91433: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867449.91641: variable 'type' from source: play vars 25039 1726867449.91653: variable 'state' from source: include params 25039 1726867449.91662: variable 'interface' from source: play vars 25039 1726867449.91671: variable 'current_interfaces' from source: set_fact 25039 1726867449.91689: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 25039 1726867449.91696: when evaluation is False, skipping this task 25039 1726867449.91704: _execute() done 25039 1726867449.91719: dumping result to json 25039 1726867449.91732: done dumping result, returning 25039 1726867449.91743: done running TaskExecutor() for managed_node1/TASK: Delete tap interface veth0 [0affcac9-a3a5-3ddc-7272-000000000160] 25039 1726867449.91825: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000160 25039 1726867449.91887: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000160 25039 1726867449.91890: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
25039 1726867449.91941: no more pending results, returning what we have 25039 1726867449.91945: results queue empty 25039 1726867449.91946: checking for any_errors_fatal 25039 1726867449.91953: done checking for any_errors_fatal 25039 1726867449.91953: checking for max_fail_percentage 25039 1726867449.91955: done checking for max_fail_percentage 25039 1726867449.91956: checking to see if all hosts have failed and the running result is not ok 25039 1726867449.91957: done checking to see if all hosts have failed 25039 1726867449.91958: getting the remaining hosts for this loop 25039 1726867449.91959: done getting the remaining hosts for this loop 25039 1726867449.91963: getting the next task for host managed_node1 25039 1726867449.91970: done getting next task for host managed_node1 25039 1726867449.91973: ^ task is: TASK: Set up gateway ip on veth peer 25039 1726867449.91976: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867449.91983: getting variables 25039 1726867449.91985: in VariableManager get_vars() 25039 1726867449.92027: Calling all_inventory to load vars for managed_node1 25039 1726867449.92030: Calling groups_inventory to load vars for managed_node1 25039 1726867449.92032: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867449.92045: Calling all_plugins_play to load vars for managed_node1 25039 1726867449.92048: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867449.92052: Calling groups_plugins_play to load vars for managed_node1 25039 1726867449.92364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867449.92583: done with get_vars() 25039 1726867449.92592: done getting variables 25039 1726867449.92675: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set up gateway ip on veth peer] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:15 Friday 20 September 2024 17:24:09 -0400 (0:00:00.024) 0:00:07.452 ****** 25039 1726867449.92700: entering _queue_task() for managed_node1/shell 25039 1726867449.92702: Creating lock for shell 25039 1726867449.92919: worker is 1 (out of 1 available) 25039 1726867449.92932: exiting _queue_task() for managed_node1/shell 25039 1726867449.93057: done queuing things up, now waiting for results queue to drain 25039 1726867449.93059: waiting for pending results... 
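
At this point the run moves from the always-skipped interface-management tasks to the first task that actually executes: the `shell` action plugin is loaded (cache miss, `found_in_cache=False`) for "Set up gateway ip on veth peer" at tests_ipv6.yml:15. A hedged sketch of a task of this shape; the log confirms only the task name, its path, that it is a shell task, and that the distribution-version conditional evaluated True, while the command body and variable names are assumptions:

```yaml
# Hypothetical sketch only. Confirmed by the log: the task name, the path
# tests_ipv6.yml:15, the use of the shell action, and the conditional
# ansible_distribution_major_version != '6' evaluating True.
# The address and device arguments below are invented placeholders.
- name: Set up gateway ip on veth peer
  ansible.builtin.shell: |
    ip -6 addr add {{ gateway_ip }} dev peer{{ interface }}  # assumed body
  when: ansible_distribution_major_version != '6'
```

The subsequent `_low_level_execute_command()` records show the standard execution sequence for such a task: discover the remote home directory (`echo ~`), create a per-task temporary directory under `/root/.ansible/tmp`, then transfer and run `AnsiballZ_command.py` over the multiplexed SSH connection.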
25039 1726867449.93290: running TaskExecutor() for managed_node1/TASK: Set up gateway ip on veth peer 25039 1726867449.93295: in run() - task 0affcac9-a3a5-3ddc-7272-00000000000d 25039 1726867449.93298: variable 'ansible_search_path' from source: unknown 25039 1726867449.93386: calling self._execute() 25039 1726867449.93420: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.93431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.93444: variable 'omit' from source: magic vars 25039 1726867449.93786: variable 'ansible_distribution_major_version' from source: facts 25039 1726867449.93804: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867449.93848: variable 'omit' from source: magic vars 25039 1726867449.93854: variable 'omit' from source: magic vars 25039 1726867449.93995: variable 'interface' from source: play vars 25039 1726867449.94019: variable 'omit' from source: magic vars 25039 1726867449.94068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867449.94145: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867449.94148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867449.94151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867449.94168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867449.94203: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867449.94215: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.94222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.94329: 
Set connection var ansible_shell_executable to /bin/sh 25039 1726867449.94361: Set connection var ansible_timeout to 10 25039 1726867449.94365: Set connection var ansible_shell_type to sh 25039 1726867449.94367: Set connection var ansible_connection to ssh 25039 1726867449.94369: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867449.94374: Set connection var ansible_pipelining to False 25039 1726867449.94471: variable 'ansible_shell_executable' from source: unknown 25039 1726867449.94474: variable 'ansible_connection' from source: unknown 25039 1726867449.94476: variable 'ansible_module_compression' from source: unknown 25039 1726867449.94480: variable 'ansible_shell_type' from source: unknown 25039 1726867449.94482: variable 'ansible_shell_executable' from source: unknown 25039 1726867449.94484: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867449.94486: variable 'ansible_pipelining' from source: unknown 25039 1726867449.94488: variable 'ansible_timeout' from source: unknown 25039 1726867449.94490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867449.94590: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867449.94612: variable 'omit' from source: magic vars 25039 1726867449.94621: starting attempt loop 25039 1726867449.94627: running the handler 25039 1726867449.94639: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867449.94661: 
_low_level_execute_command(): starting 25039 1726867449.94672: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867449.95424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867449.95554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867449.95558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.95583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.95679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867449.97347: stdout chunk (state=3): >>>/root <<< 25039 1726867449.97516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867449.97520: stdout chunk (state=3): >>><<< 25039 1726867449.97522: stderr chunk (state=3): >>><<< 25039 1726867449.97550: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867449.97642: _low_level_execute_command(): starting 25039 1726867449.97646: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411 `" && echo ansible-tmp-1726867449.9755683-25411-39351970879411="` echo /root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411 `" ) && sleep 0' 25039 1726867449.98206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867449.98247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867449.98292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867450.00168: stdout chunk (state=3): >>>ansible-tmp-1726867449.9755683-25411-39351970879411=/root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411 <<< 25039 1726867450.00285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867450.00337: stderr chunk (state=3): >>><<< 25039 1726867450.00349: stdout chunk (state=3): >>><<< 25039 1726867450.00380: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867449.9755683-25411-39351970879411=/root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867450.00483: variable 'ansible_module_compression' from source: unknown 25039 1726867450.00487: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867450.00518: variable 'ansible_facts' from source: unknown 25039 1726867450.00614: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411/AnsiballZ_command.py 25039 1726867450.00868: Sending initial data 25039 1726867450.00871: Sent initial data (155 bytes) 25039 1726867450.02053: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867450.02319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867450.02359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867450.03933: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25039 1726867450.03937: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867450.03982: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867450.04101: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp7w9wpt_0 /root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411/AnsiballZ_command.py <<< 25039 1726867450.04108: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411/AnsiballZ_command.py" <<< 25039 1726867450.04146: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp7w9wpt_0" to remote "/root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411/AnsiballZ_command.py" <<< 25039 1726867450.04149: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411/AnsiballZ_command.py" <<< 25039 1726867450.05249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867450.05281: stderr chunk (state=3): >>><<< 25039 1726867450.05284: stdout chunk (state=3): >>><<< 25039 1726867450.05344: done transferring module to remote 25039 1726867450.05354: _low_level_execute_command(): starting 25039 1726867450.05359: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411/ /root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411/AnsiballZ_command.py && sleep 0' 25039 1726867450.06082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867450.06085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867450.06088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867450.06091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867450.06094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867450.06102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867450.06118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867450.06190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867450.08280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867450.08283: stdout chunk (state=3): >>><<< 25039 1726867450.08286: stderr chunk (state=3): >>><<< 25039 1726867450.08288: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867450.08295: _low_level_execute_command(): starting 25039 1726867450.08297: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411/AnsiballZ_command.py && sleep 0' 25039 1726867450.09396: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867450.09400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867450.09402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867450.09405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867450.09582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867450.09586: stderr chunk (state=3): >>>debug2: match not found <<< 25039 1726867450.09588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867450.09590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25039 1726867450.09592: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 25039 1726867450.09595: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867450.09598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867450.09600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867450.09803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867450.09875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867450.27013: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-20 17:24:10.248357", "end": "2024-09-20 17:24:10.267935", "delta": "0:00:00.019578", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867450.28666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867450.28670: stdout chunk (state=3): >>><<< 25039 1726867450.28673: stderr chunk (state=3): >>><<< 25039 1726867450.28695: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-20 17:24:10.248357", "end": "2024-09-20 17:24:10.267935", "delta": "0:00:00.019578", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 25039 1726867450.28739: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867450.29062: _low_level_execute_command(): starting 25039 1726867450.29066: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867449.9755683-25411-39351970879411/ > /dev/null 2>&1 && sleep 0' 25039 1726867450.30373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867450.30379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867450.30382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867450.30384: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867450.30390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867450.30436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867450.30503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867450.30576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867450.32389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867450.32421: stderr chunk (state=3): >>><<< 25039 1726867450.32431: stdout chunk (state=3): >>><<< 25039 1726867450.32467: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867450.32671: handler run complete 25039 1726867450.32674: Evaluated conditional (False): False 25039 1726867450.32679: attempt loop complete, returning result 25039 1726867450.32682: _execute() done 25039 1726867450.32684: dumping result to json 25039 1726867450.32686: done dumping result, returning 25039 1726867450.32688: done running TaskExecutor() for managed_node1/TASK: Set up gateway ip on veth peer [0affcac9-a3a5-3ddc-7272-00000000000d] 25039 1726867450.32691: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000000d 25039 1726867450.32762: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000000d 25039 1726867450.32765: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "delta": "0:00:00.019578", "end": "2024-09-20 17:24:10.267935", "rc": 0, "start": "2024-09-20 17:24:10.248357" } 25039 1726867450.32840: no more pending results, returning what we have 25039 1726867450.32844: results queue empty 25039 1726867450.32845: checking for any_errors_fatal 25039 1726867450.32850: done checking for any_errors_fatal 25039 1726867450.32851: checking for max_fail_percentage 25039 1726867450.32854: done checking for max_fail_percentage 25039 1726867450.32855: checking to see if all hosts have failed and the running result is not ok 25039 1726867450.32856: done checking to see if all hosts have failed 25039 1726867450.32856: getting the remaining hosts for this loop 25039 
1726867450.32858: done getting the remaining hosts for this loop 25039 1726867450.32861: getting the next task for host managed_node1 25039 1726867450.32869: done getting next task for host managed_node1 25039 1726867450.32872: ^ task is: TASK: TEST: I can configure an interface with static ipv6 config 25039 1726867450.32874: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867450.32880: getting variables 25039 1726867450.32882: in VariableManager get_vars() 25039 1726867450.32925: Calling all_inventory to load vars for managed_node1 25039 1726867450.32929: Calling groups_inventory to load vars for managed_node1 25039 1726867450.32932: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867450.32944: Calling all_plugins_play to load vars for managed_node1 25039 1726867450.32947: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867450.32951: Calling groups_plugins_play to load vars for managed_node1 25039 1726867450.33341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867450.33737: done with get_vars() 25039 1726867450.33748: done getting variables 25039 1726867450.34015: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with static ipv6 config] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:27 Friday 20 September 2024 17:24:10 
-0400 (0:00:00.413) 0:00:07.866 ****** 25039 1726867450.34044: entering _queue_task() for managed_node1/debug 25039 1726867450.34497: worker is 1 (out of 1 available) 25039 1726867450.34510: exiting _queue_task() for managed_node1/debug 25039 1726867450.34522: done queuing things up, now waiting for results queue to drain 25039 1726867450.34523: waiting for pending results... 25039 1726867450.35197: running TaskExecutor() for managed_node1/TASK: TEST: I can configure an interface with static ipv6 config 25039 1726867450.35202: in run() - task 0affcac9-a3a5-3ddc-7272-00000000000f 25039 1726867450.35206: variable 'ansible_search_path' from source: unknown 25039 1726867450.35241: calling self._execute() 25039 1726867450.35341: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867450.35347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867450.35358: variable 'omit' from source: magic vars 25039 1726867450.36195: variable 'ansible_distribution_major_version' from source: facts 25039 1726867450.36216: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867450.36288: variable 'omit' from source: magic vars 25039 1726867450.36319: variable 'omit' from source: magic vars 25039 1726867450.36358: variable 'omit' from source: magic vars 25039 1726867450.36518: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867450.36558: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867450.36820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867450.36825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867450.36828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 25039 1726867450.36831: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867450.36835: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867450.36837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867450.37000: Set connection var ansible_shell_executable to /bin/sh 25039 1726867450.37087: Set connection var ansible_timeout to 10 25039 1726867450.37128: Set connection var ansible_shell_type to sh 25039 1726867450.37136: Set connection var ansible_connection to ssh 25039 1726867450.37362: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867450.37366: Set connection var ansible_pipelining to False 25039 1726867450.37388: variable 'ansible_shell_executable' from source: unknown 25039 1726867450.37481: variable 'ansible_connection' from source: unknown 25039 1726867450.37491: variable 'ansible_module_compression' from source: unknown 25039 1726867450.37495: variable 'ansible_shell_type' from source: unknown 25039 1726867450.37497: variable 'ansible_shell_executable' from source: unknown 25039 1726867450.37500: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867450.37504: variable 'ansible_pipelining' from source: unknown 25039 1726867450.37515: variable 'ansible_timeout' from source: unknown 25039 1726867450.37524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867450.37772: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867450.37983: variable 'omit' from source: magic vars 25039 1726867450.37987: starting attempt loop 25039 1726867450.37989: running the handler 25039 
1726867450.37991: handler run complete 25039 1726867450.38005: attempt loop complete, returning result 25039 1726867450.38019: _execute() done 25039 1726867450.38026: dumping result to json 25039 1726867450.38035: done dumping result, returning 25039 1726867450.38042: done running TaskExecutor() for managed_node1/TASK: TEST: I can configure an interface with static ipv6 config [0affcac9-a3a5-3ddc-7272-00000000000f] 25039 1726867450.38184: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000000f 25039 1726867450.38258: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000000f 25039 1726867450.38262: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ################################################## 25039 1726867450.38310: no more pending results, returning what we have 25039 1726867450.38313: results queue empty 25039 1726867450.38314: checking for any_errors_fatal 25039 1726867450.38320: done checking for any_errors_fatal 25039 1726867450.38321: checking for max_fail_percentage 25039 1726867450.38322: done checking for max_fail_percentage 25039 1726867450.38323: checking to see if all hosts have failed and the running result is not ok 25039 1726867450.38324: done checking to see if all hosts have failed 25039 1726867450.38325: getting the remaining hosts for this loop 25039 1726867450.38326: done getting the remaining hosts for this loop 25039 1726867450.38331: getting the next task for host managed_node1 25039 1726867450.38339: done getting next task for host managed_node1 25039 1726867450.38344: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25039 1726867450.38348: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867450.38363: getting variables 25039 1726867450.38365: in VariableManager get_vars() 25039 1726867450.38408: Calling all_inventory to load vars for managed_node1 25039 1726867450.38411: Calling groups_inventory to load vars for managed_node1 25039 1726867450.38413: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867450.38424: Calling all_plugins_play to load vars for managed_node1 25039 1726867450.38427: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867450.38429: Calling groups_plugins_play to load vars for managed_node1 25039 1726867450.39038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867450.39406: done with get_vars() 25039 1726867450.39416: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:24:10 -0400 (0:00:00.055) 0:00:07.921 ****** 25039 1726867450.39624: entering _queue_task() for managed_node1/include_tasks 25039 1726867450.40147: worker is 1 (out of 1 available) 25039 1726867450.40160: exiting _queue_task() for managed_node1/include_tasks 25039 1726867450.40171: done queuing things up, now waiting for results queue to drain 25039 1726867450.40172: waiting for pending results... 
25039 1726867450.40794: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25039 1726867450.40800: in run() - task 0affcac9-a3a5-3ddc-7272-000000000017 25039 1726867450.40803: variable 'ansible_search_path' from source: unknown 25039 1726867450.40806: variable 'ansible_search_path' from source: unknown 25039 1726867450.41182: calling self._execute() 25039 1726867450.41185: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867450.41189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867450.41192: variable 'omit' from source: magic vars 25039 1726867450.41812: variable 'ansible_distribution_major_version' from source: facts 25039 1726867450.41829: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867450.41839: _execute() done 25039 1726867450.41849: dumping result to json 25039 1726867450.41857: done dumping result, returning 25039 1726867450.41868: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-3ddc-7272-000000000017] 25039 1726867450.41880: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000017 25039 1726867450.42015: no more pending results, returning what we have 25039 1726867450.42020: in VariableManager get_vars() 25039 1726867450.42071: Calling all_inventory to load vars for managed_node1 25039 1726867450.42074: Calling groups_inventory to load vars for managed_node1 25039 1726867450.42078: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867450.42091: Calling all_plugins_play to load vars for managed_node1 25039 1726867450.42094: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867450.42097: Calling groups_plugins_play to load vars for managed_node1 25039 1726867450.42378: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000017 25039 
1726867450.42382: WORKER PROCESS EXITING 25039 1726867450.42498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867450.42893: done with get_vars() 25039 1726867450.42901: variable 'ansible_search_path' from source: unknown 25039 1726867450.42902: variable 'ansible_search_path' from source: unknown 25039 1726867450.43003: we have included files to process 25039 1726867450.43004: generating all_blocks data 25039 1726867450.43006: done generating all_blocks data 25039 1726867450.43011: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25039 1726867450.43012: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25039 1726867450.43014: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25039 1726867450.45212: done processing included file 25039 1726867450.45214: iterating over new_blocks loaded from include file 25039 1726867450.45216: in VariableManager get_vars() 25039 1726867450.45244: done with get_vars() 25039 1726867450.45246: filtering new block on tags 25039 1726867450.45266: done filtering new block on tags 25039 1726867450.45270: in VariableManager get_vars() 25039 1726867450.45396: done with get_vars() 25039 1726867450.45398: filtering new block on tags 25039 1726867450.45420: done filtering new block on tags 25039 1726867450.45423: in VariableManager get_vars() 25039 1726867450.45447: done with get_vars() 25039 1726867450.45449: filtering new block on tags 25039 1726867450.45467: done filtering new block on tags 25039 1726867450.45469: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 25039 1726867450.45475: extending task lists for all hosts 
with included blocks 25039 1726867450.47076: done extending task lists 25039 1726867450.47079: done processing included files 25039 1726867450.47080: results queue empty 25039 1726867450.47081: checking for any_errors_fatal 25039 1726867450.47085: done checking for any_errors_fatal 25039 1726867450.47085: checking for max_fail_percentage 25039 1726867450.47087: done checking for max_fail_percentage 25039 1726867450.47087: checking to see if all hosts have failed and the running result is not ok 25039 1726867450.47088: done checking to see if all hosts have failed 25039 1726867450.47089: getting the remaining hosts for this loop 25039 1726867450.47090: done getting the remaining hosts for this loop 25039 1726867450.47093: getting the next task for host managed_node1 25039 1726867450.47097: done getting next task for host managed_node1 25039 1726867450.47099: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25039 1726867450.47103: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867450.47112: getting variables 25039 1726867450.47113: in VariableManager get_vars() 25039 1726867450.47128: Calling all_inventory to load vars for managed_node1 25039 1726867450.47130: Calling groups_inventory to load vars for managed_node1 25039 1726867450.47132: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867450.47137: Calling all_plugins_play to load vars for managed_node1 25039 1726867450.47140: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867450.47142: Calling groups_plugins_play to load vars for managed_node1 25039 1726867450.47911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867450.48311: done with get_vars() 25039 1726867450.48320: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:24:10 -0400 (0:00:00.089) 0:00:08.011 ****** 25039 1726867450.48591: entering _queue_task() for managed_node1/setup 25039 1726867450.49297: worker is 1 (out of 1 available) 25039 1726867450.49308: exiting _queue_task() for managed_node1/setup 25039 1726867450.49318: done queuing things up, now waiting for results queue to drain 25039 1726867450.49319: waiting for pending results... 
25039 1726867450.49893: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25039 1726867450.49900: in run() - task 0affcac9-a3a5-3ddc-7272-0000000001fc 25039 1726867450.49903: variable 'ansible_search_path' from source: unknown 25039 1726867450.49906: variable 'ansible_search_path' from source: unknown 25039 1726867450.50107: calling self._execute() 25039 1726867450.50197: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867450.50212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867450.50224: variable 'omit' from source: magic vars 25039 1726867450.50795: variable 'ansible_distribution_major_version' from source: facts 25039 1726867450.50821: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867450.51055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867450.53884: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867450.54175: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867450.54226: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867450.54266: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867450.54299: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867450.54563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867450.54783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867450.54787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867450.54790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867450.54792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867450.54795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867450.54798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867450.54801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867450.55023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867450.55041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867450.55436: variable '__network_required_facts' from source: role 
'' defaults 25039 1726867450.55449: variable 'ansible_facts' from source: unknown 25039 1726867450.55543: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 25039 1726867450.55982: when evaluation is False, skipping this task 25039 1726867450.55986: _execute() done 25039 1726867450.55988: dumping result to json 25039 1726867450.55991: done dumping result, returning 25039 1726867450.55994: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-3ddc-7272-0000000001fc] 25039 1726867450.55996: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000001fc 25039 1726867450.56071: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000001fc 25039 1726867450.56075: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25039 1726867450.56123: no more pending results, returning what we have 25039 1726867450.56126: results queue empty 25039 1726867450.56127: checking for any_errors_fatal 25039 1726867450.56129: done checking for any_errors_fatal 25039 1726867450.56129: checking for max_fail_percentage 25039 1726867450.56131: done checking for max_fail_percentage 25039 1726867450.56131: checking to see if all hosts have failed and the running result is not ok 25039 1726867450.56132: done checking to see if all hosts have failed 25039 1726867450.56133: getting the remaining hosts for this loop 25039 1726867450.56135: done getting the remaining hosts for this loop 25039 1726867450.56138: getting the next task for host managed_node1 25039 1726867450.56146: done getting next task for host managed_node1 25039 1726867450.56149: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 25039 1726867450.56153: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867450.56165: getting variables 25039 1726867450.56167: in VariableManager get_vars() 25039 1726867450.56206: Calling all_inventory to load vars for managed_node1 25039 1726867450.56209: Calling groups_inventory to load vars for managed_node1 25039 1726867450.56211: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867450.56220: Calling all_plugins_play to load vars for managed_node1 25039 1726867450.56222: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867450.56225: Calling groups_plugins_play to load vars for managed_node1 25039 1726867450.56643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867450.57201: done with get_vars() 25039 1726867450.57212: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:24:10 -0400 (0:00:00.088) 0:00:08.099 ****** 25039 1726867450.57408: entering _queue_task() for managed_node1/stat 25039 1726867450.58065: worker is 1 (out of 1 
available) 25039 1726867450.58081: exiting _queue_task() for managed_node1/stat 25039 1726867450.58095: done queuing things up, now waiting for results queue to drain 25039 1726867450.58096: waiting for pending results... 25039 1726867450.58337: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 25039 1726867450.58727: in run() - task 0affcac9-a3a5-3ddc-7272-0000000001fe 25039 1726867450.58745: variable 'ansible_search_path' from source: unknown 25039 1726867450.58752: variable 'ansible_search_path' from source: unknown 25039 1726867450.58793: calling self._execute() 25039 1726867450.58881: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867450.59182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867450.59186: variable 'omit' from source: magic vars 25039 1726867450.59963: variable 'ansible_distribution_major_version' from source: facts 25039 1726867450.60048: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867450.60321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867450.61235: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867450.61289: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867450.61338: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867450.61375: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867450.61599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867450.61635: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867450.61716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867450.61799: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867450.62004: variable '__network_is_ostree' from source: set_fact 25039 1726867450.62019: Evaluated conditional (not __network_is_ostree is defined): False 25039 1726867450.62095: when evaluation is False, skipping this task 25039 1726867450.62106: _execute() done 25039 1726867450.62119: dumping result to json 25039 1726867450.62128: done dumping result, returning 25039 1726867450.62142: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-3ddc-7272-0000000001fe] 25039 1726867450.62154: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000001fe skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25039 1726867450.62315: no more pending results, returning what we have 25039 1726867450.62319: results queue empty 25039 1726867450.62320: checking for any_errors_fatal 25039 1726867450.62327: done checking for any_errors_fatal 25039 1726867450.62328: checking for max_fail_percentage 25039 1726867450.62329: done checking for max_fail_percentage 25039 1726867450.62331: checking to see if all hosts have failed and the running result is not ok 25039 1726867450.62332: done checking to see if all hosts have failed 25039 1726867450.62332: getting the remaining hosts for this loop 25039 
1726867450.62334: done getting the remaining hosts for this loop 25039 1726867450.62337: getting the next task for host managed_node1 25039 1726867450.62345: done getting next task for host managed_node1 25039 1726867450.62349: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25039 1726867450.62353: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867450.62367: getting variables 25039 1726867450.62369: in VariableManager get_vars() 25039 1726867450.62414: Calling all_inventory to load vars for managed_node1 25039 1726867450.62417: Calling groups_inventory to load vars for managed_node1 25039 1726867450.62419: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867450.62429: Calling all_plugins_play to load vars for managed_node1 25039 1726867450.62431: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867450.62434: Calling groups_plugins_play to load vars for managed_node1 25039 1726867450.62803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867450.63586: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000001fe 25039 1726867450.63589: WORKER PROCESS EXITING 25039 1726867450.63618: done with get_vars() 25039 1726867450.63630: done getting variables 25039 1726867450.63798: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:24:10 -0400 (0:00:00.064) 0:00:08.163 ****** 25039 1726867450.63835: entering _queue_task() for managed_node1/set_fact 25039 1726867450.64422: worker is 1 (out of 1 available) 25039 1726867450.64434: exiting _queue_task() for managed_node1/set_fact 25039 1726867450.64446: done queuing things up, now waiting for results queue to drain 25039 1726867450.64448: waiting for pending results... 
25039 1726867450.65020: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25039 1726867450.65247: in run() - task 0affcac9-a3a5-3ddc-7272-0000000001ff 25039 1726867450.65267: variable 'ansible_search_path' from source: unknown 25039 1726867450.65324: variable 'ansible_search_path' from source: unknown 25039 1726867450.65365: calling self._execute() 25039 1726867450.65503: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867450.65591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867450.65605: variable 'omit' from source: magic vars 25039 1726867450.66285: variable 'ansible_distribution_major_version' from source: facts 25039 1726867450.66395: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867450.66598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867450.67233: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867450.67325: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867450.67426: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867450.67467: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867450.67667: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867450.67746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867450.67781: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867450.67857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867450.68071: variable '__network_is_ostree' from source: set_fact 25039 1726867450.68086: Evaluated conditional (not __network_is_ostree is defined): False 25039 1726867450.68094: when evaluation is False, skipping this task 25039 1726867450.68102: _execute() done 25039 1726867450.68111: dumping result to json 25039 1726867450.68119: done dumping result, returning 25039 1726867450.68287: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-3ddc-7272-0000000001ff] 25039 1726867450.68290: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000001ff 25039 1726867450.68354: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000001ff 25039 1726867450.68358: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25039 1726867450.68438: no more pending results, returning what we have 25039 1726867450.68442: results queue empty 25039 1726867450.68443: checking for any_errors_fatal 25039 1726867450.68450: done checking for any_errors_fatal 25039 1726867450.68451: checking for max_fail_percentage 25039 1726867450.68454: done checking for max_fail_percentage 25039 1726867450.68455: checking to see if all hosts have failed and the running result is not ok 25039 1726867450.68456: done checking to see if all hosts have failed 25039 1726867450.68457: getting the remaining hosts for this loop 25039 1726867450.68458: done getting the remaining hosts for this loop 
25039 1726867450.68462: getting the next task for host managed_node1 25039 1726867450.68472: done getting next task for host managed_node1 25039 1726867450.68479: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 25039 1726867450.68483: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867450.68496: getting variables 25039 1726867450.68498: in VariableManager get_vars() 25039 1726867450.68540: Calling all_inventory to load vars for managed_node1 25039 1726867450.68543: Calling groups_inventory to load vars for managed_node1 25039 1726867450.68546: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867450.68557: Calling all_plugins_play to load vars for managed_node1 25039 1726867450.68560: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867450.68563: Calling groups_plugins_play to load vars for managed_node1 25039 1726867450.68918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867450.69622: done with get_vars() 25039 1726867450.69632: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:24:10 -0400 (0:00:00.061) 0:00:08.225 ****** 25039 1726867450.69974: entering _queue_task() for managed_node1/service_facts 25039 1726867450.69976: Creating lock for service_facts 25039 1726867450.70442: worker is 1 (out of 1 available) 25039 1726867450.70454: exiting _queue_task() for managed_node1/service_facts 25039 1726867450.70465: done queuing things up, now waiting for results queue to drain 25039 1726867450.70466: waiting for pending results... 
25039 1726867450.71064: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 25039 1726867450.71262: in run() - task 0affcac9-a3a5-3ddc-7272-000000000201 25039 1726867450.71493: variable 'ansible_search_path' from source: unknown 25039 1726867450.71497: variable 'ansible_search_path' from source: unknown 25039 1726867450.71500: calling self._execute() 25039 1726867450.71685: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867450.71689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867450.71692: variable 'omit' from source: magic vars 25039 1726867450.72237: variable 'ansible_distribution_major_version' from source: facts 25039 1726867450.72372: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867450.72385: variable 'omit' from source: magic vars 25039 1726867450.72502: variable 'omit' from source: magic vars 25039 1726867450.72616: variable 'omit' from source: magic vars 25039 1726867450.72658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867450.72754: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867450.72812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867450.72997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867450.73060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867450.73264: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867450.73267: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867450.73269: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 25039 1726867450.73404: Set connection var ansible_shell_executable to /bin/sh 25039 1726867450.73416: Set connection var ansible_timeout to 10 25039 1726867450.73426: Set connection var ansible_shell_type to sh 25039 1726867450.73431: Set connection var ansible_connection to ssh 25039 1726867450.73442: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867450.73450: Set connection var ansible_pipelining to False 25039 1726867450.73685: variable 'ansible_shell_executable' from source: unknown 25039 1726867450.73688: variable 'ansible_connection' from source: unknown 25039 1726867450.73691: variable 'ansible_module_compression' from source: unknown 25039 1726867450.73693: variable 'ansible_shell_type' from source: unknown 25039 1726867450.73697: variable 'ansible_shell_executable' from source: unknown 25039 1726867450.73699: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867450.73701: variable 'ansible_pipelining' from source: unknown 25039 1726867450.73703: variable 'ansible_timeout' from source: unknown 25039 1726867450.73705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867450.73954: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25039 1726867450.74022: variable 'omit' from source: magic vars 25039 1726867450.74037: starting attempt loop 25039 1726867450.74045: running the handler 25039 1726867450.74065: _low_level_execute_command(): starting 25039 1726867450.74080: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867450.75400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 25039 1726867450.75404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867450.75615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867450.75670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867450.75690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867450.75693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867450.75786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867450.77421: stdout chunk (state=3): >>>/root <<< 25039 1726867450.77555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867450.77566: stdout chunk (state=3): >>><<< 25039 1726867450.77582: stderr chunk (state=3): >>><<< 25039 1726867450.77607: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867450.77872: _low_level_execute_command(): starting 25039 1726867450.77876: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474 `" && echo ansible-tmp-1726867450.7778435-25445-85683764386474="` echo /root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474 `" ) && sleep 0' 25039 1726867450.78804: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867450.78808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867450.78810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867450.78820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867450.78823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867450.78958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867450.80891: stdout chunk (state=3): >>>ansible-tmp-1726867450.7778435-25445-85683764386474=/root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474 <<< 25039 1726867450.80994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867450.81027: stderr chunk (state=3): >>><<< 25039 1726867450.81036: stdout chunk (state=3): >>><<< 25039 1726867450.81101: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867450.7778435-25445-85683764386474=/root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867450.81158: variable 'ansible_module_compression' from source: unknown 25039 1726867450.81284: ANSIBALLZ: Using lock for service_facts 25039 1726867450.81287: ANSIBALLZ: Acquiring lock 25039 1726867450.81290: ANSIBALLZ: Lock acquired: 140682440288704 25039 1726867450.81486: ANSIBALLZ: Creating module 25039 1726867451.00202: ANSIBALLZ: Writing module into payload 25039 1726867451.00298: ANSIBALLZ: Writing module 25039 1726867451.00322: ANSIBALLZ: Renaming module 25039 1726867451.00330: ANSIBALLZ: Done creating module 25039 1726867451.00382: variable 'ansible_facts' from source: unknown 25039 1726867451.00558: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474/AnsiballZ_service_facts.py 25039 1726867451.00602: Sending initial data 25039 1726867451.00605: Sent initial data (161 bytes) 25039 1726867451.01210: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867451.01219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867451.01228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867451.01298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867451.01347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867451.01360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867451.01383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867451.01520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867451.03121: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867451.03181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867451.03268: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpk4gtms_u /root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474/AnsiballZ_service_facts.py <<< 25039 1726867451.03287: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474/AnsiballZ_service_facts.py" <<< 25039 1726867451.03308: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpk4gtms_u" to remote "/root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474/AnsiballZ_service_facts.py" <<< 25039 1726867451.04123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867451.04295: stderr chunk (state=3): >>><<< 25039 1726867451.04299: stdout chunk (state=3): >>><<< 25039 1726867451.04302: done transferring module to remote 25039 1726867451.04304: _low_level_execute_command(): starting 25039 1726867451.04306: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474/ /root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474/AnsiballZ_service_facts.py && sleep 0' 25039 1726867451.04891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867451.04906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867451.04931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867451.05041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867451.05072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867451.05308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867451.06926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867451.06986: stderr chunk (state=3): >>><<< 25039 1726867451.07002: stdout chunk (state=3): >>><<< 25039 1726867451.07184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867451.07188: _low_level_execute_command(): starting 25039 1726867451.07191: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474/AnsiballZ_service_facts.py && sleep 0' 25039 1726867451.07760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867451.07774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867451.07789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867451.07864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867451.07908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867451.07926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 
1726867451.07941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867451.08019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867452.62009: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": 
"nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source"<<< 25039 1726867452.62024: stdout chunk (state=3): >>>: "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 25039 1726867452.63390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 25039 1726867452.63434: stderr chunk (state=3): >>><<< 25039 1726867452.63437: stdout chunk (state=3): >>><<< 25039 1726867452.63485: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": 
"dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", 
"status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": 
{"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.12.57 closed. 25039 1726867452.64172: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867452.64183: _low_level_execute_command(): starting 25039 1726867452.64189: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867450.7778435-25445-85683764386474/ > /dev/null 2>&1 && sleep 0' 25039 1726867452.64859: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867452.64862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867452.64865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867452.64867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867452.64869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867452.64927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867452.64937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867452.64962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867452.65013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867452.66982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867452.66986: stdout chunk (state=3): >>><<< 25039 1726867452.66988: stderr chunk (state=3): >>><<< 25039 1726867452.66990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867452.66992: handler run complete 25039 1726867452.67052: variable 'ansible_facts' from source: unknown 25039 1726867452.67215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867452.67715: variable 'ansible_facts' from source: unknown 25039 1726867452.67870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867452.68097: attempt loop complete, returning result 25039 1726867452.68100: _execute() done 25039 1726867452.68103: dumping result to json 25039 1726867452.68168: done dumping result, returning 25039 1726867452.68179: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-3ddc-7272-000000000201] 25039 1726867452.68454: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000201 25039 1726867452.69627: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000201 25039 1726867452.69631: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25039 1726867452.69755: no more pending results, returning what we have 25039 1726867452.69758: results queue empty 25039 1726867452.69759: checking for any_errors_fatal 25039 1726867452.69762: done checking for any_errors_fatal 25039 1726867452.69763: checking for max_fail_percentage 25039 1726867452.69764: done checking for max_fail_percentage 25039 1726867452.69765: checking to see if all hosts have failed and the running result is not ok 25039 1726867452.69766: done checking to see if all hosts have failed 25039 1726867452.69767: getting the remaining hosts for this loop 25039 1726867452.69768: done getting the remaining hosts for this loop 25039 
1726867452.69771: getting the next task for host managed_node1 25039 1726867452.69778: done getting next task for host managed_node1 25039 1726867452.69782: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 25039 1726867452.69787: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867452.69797: getting variables 25039 1726867452.69798: in VariableManager get_vars() 25039 1726867452.69830: Calling all_inventory to load vars for managed_node1 25039 1726867452.69833: Calling groups_inventory to load vars for managed_node1 25039 1726867452.69835: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867452.69844: Calling all_plugins_play to load vars for managed_node1 25039 1726867452.69846: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867452.69849: Calling groups_plugins_play to load vars for managed_node1 25039 1726867452.70672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867452.71230: done with get_vars() 25039 1726867452.71243: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:24:12 -0400 (0:00:02.013) 0:00:10.239 ****** 25039 1726867452.71348: entering _queue_task() for managed_node1/package_facts 25039 1726867452.71350: Creating lock for package_facts 25039 1726867452.71636: worker is 1 (out of 1 available) 25039 1726867452.71648: exiting _queue_task() for managed_node1/package_facts 25039 1726867452.71659: done queuing things up, now waiting for results queue to drain 25039 1726867452.71660: waiting for pending results... 
25039 1726867452.72024: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 25039 1726867452.72087: in run() - task 0affcac9-a3a5-3ddc-7272-000000000202 25039 1726867452.72104: variable 'ansible_search_path' from source: unknown 25039 1726867452.72121: variable 'ansible_search_path' from source: unknown 25039 1726867452.72169: calling self._execute() 25039 1726867452.72268: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867452.72282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867452.72300: variable 'omit' from source: magic vars 25039 1726867452.72724: variable 'ansible_distribution_major_version' from source: facts 25039 1726867452.72740: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867452.72750: variable 'omit' from source: magic vars 25039 1726867452.72830: variable 'omit' from source: magic vars 25039 1726867452.72867: variable 'omit' from source: magic vars 25039 1726867452.72917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867452.72954: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867452.72984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867452.73095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867452.73098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867452.73100: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867452.73102: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867452.73104: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 25039 1726867452.73172: Set connection var ansible_shell_executable to /bin/sh 25039 1726867452.73186: Set connection var ansible_timeout to 10 25039 1726867452.73203: Set connection var ansible_shell_type to sh 25039 1726867452.73218: Set connection var ansible_connection to ssh 25039 1726867452.73316: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867452.73321: Set connection var ansible_pipelining to False 25039 1726867452.73324: variable 'ansible_shell_executable' from source: unknown 25039 1726867452.73326: variable 'ansible_connection' from source: unknown 25039 1726867452.73328: variable 'ansible_module_compression' from source: unknown 25039 1726867452.73330: variable 'ansible_shell_type' from source: unknown 25039 1726867452.73333: variable 'ansible_shell_executable' from source: unknown 25039 1726867452.73335: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867452.73337: variable 'ansible_pipelining' from source: unknown 25039 1726867452.73339: variable 'ansible_timeout' from source: unknown 25039 1726867452.73341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867452.73847: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25039 1726867452.73864: variable 'omit' from source: magic vars 25039 1726867452.73873: starting attempt loop 25039 1726867452.73882: running the handler 25039 1726867452.73903: _low_level_execute_command(): starting 25039 1726867452.73983: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867452.75446: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867452.75526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867452.75619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867452.75658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867452.75722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867452.77386: stdout chunk (state=3): >>>/root <<< 25039 1726867452.77788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867452.77792: stdout chunk (state=3): >>><<< 25039 1726867452.77795: stderr chunk (state=3): >>><<< 25039 1726867452.77798: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867452.77800: _low_level_execute_command(): starting 25039 1726867452.77802: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510 `" && echo ansible-tmp-1726867452.776173-25524-279255720556510="` echo /root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510 `" ) && sleep 0' 25039 1726867452.78669: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867452.78672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867452.78971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867452.78995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867452.79080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867452.80981: stdout chunk (state=3): >>>ansible-tmp-1726867452.776173-25524-279255720556510=/root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510 <<< 25039 1726867452.81088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867452.81129: stderr chunk (state=3): >>><<< 25039 1726867452.81144: stdout chunk (state=3): >>><<< 25039 1726867452.81161: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867452.776173-25524-279255720556510=/root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867452.81218: variable 'ansible_module_compression' from source: unknown 25039 1726867452.81270: ANSIBALLZ: Using lock for package_facts 25039 1726867452.81279: ANSIBALLZ: Acquiring lock 25039 1726867452.81293: ANSIBALLZ: Lock acquired: 140682440292208 25039 1726867452.81301: ANSIBALLZ: Creating module 25039 1726867453.05188: ANSIBALLZ: Writing module into payload 25039 1726867453.05325: ANSIBALLZ: Writing module 25039 1726867453.05347: ANSIBALLZ: Renaming module 25039 1726867453.05359: ANSIBALLZ: Done creating module 25039 1726867453.05375: variable 'ansible_facts' from source: unknown 25039 1726867453.05563: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510/AnsiballZ_package_facts.py 25039 1726867453.05799: Sending initial data 25039 1726867453.05802: Sent initial data (161 bytes) 25039 1726867453.06384: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867453.06387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867453.06471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867453.08124: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867453.08168: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867453.08223: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpm6oi5ffp /root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510/AnsiballZ_package_facts.py <<< 25039 1726867453.08227: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510/AnsiballZ_package_facts.py" <<< 25039 1726867453.08270: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpm6oi5ffp" to remote "/root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510/AnsiballZ_package_facts.py" <<< 25039 1726867453.09526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867453.09542: stderr chunk (state=3): >>><<< 25039 1726867453.09550: stdout chunk (state=3): >>><<< 25039 1726867453.09587: done transferring module to remote 25039 1726867453.09598: _low_level_execute_command(): starting 25039 1726867453.09647: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510/ /root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510/AnsiballZ_package_facts.py && sleep 0' 25039 1726867453.10030: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867453.10033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867453.10036: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867453.10039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867453.10041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867453.10085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867453.10104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867453.10146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867453.11903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867453.11927: stderr chunk (state=3): >>><<< 25039 1726867453.11930: stdout chunk (state=3): >>><<< 25039 1726867453.11948: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867453.11952: _low_level_execute_command(): starting 25039 1726867453.11956: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510/AnsiballZ_package_facts.py && sleep 0' 25039 1726867453.12337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867453.12364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867453.12369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867453.12372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867453.12375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867453.12378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867453.12429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867453.12435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867453.12486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867453.56632: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", 
"version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", 
"version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": 
[{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": 
[{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": 
[{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "clo<<< 25039 1726867453.56701: stdout chunk (state=3): >>>ud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 25039 1726867453.58169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867453.58185: stderr chunk (state=3): >>>Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867453.58240: stderr chunk (state=3): >>><<< 25039 1726867453.58254: stdout chunk (state=3): >>><<< 25039 1726867453.58291: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867453.61449: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867453.61482: _low_level_execute_command(): starting 25039 1726867453.61491: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867452.776173-25524-279255720556510/ > /dev/null 2>&1 && sleep 0' 25039 1726867453.62156: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867453.62276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867453.62322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867453.62414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867453.64287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867453.64291: stdout chunk (state=3): >>><<< 25039 1726867453.64382: stderr chunk (state=3): >>><<< 25039 1726867453.64385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867453.64388: handler run complete 25039 1726867453.65159: variable 'ansible_facts' from source: unknown 25039 1726867453.65672: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867453.67655: variable 'ansible_facts' from source: unknown 25039 1726867453.68073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867453.68797: attempt loop complete, returning result 25039 1726867453.68815: _execute() done 25039 1726867453.68819: dumping result to json 25039 1726867453.69026: done dumping result, returning 25039 1726867453.69030: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-3ddc-7272-000000000202] 25039 1726867453.69035: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000202 25039 1726867453.71743: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000202 25039 1726867453.71746: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25039 1726867453.71841: no more pending results, returning what we have 25039 1726867453.71844: results queue empty 25039 1726867453.71845: checking for any_errors_fatal 25039 1726867453.71849: done checking for any_errors_fatal 25039 1726867453.71850: checking for max_fail_percentage 25039 1726867453.71851: done checking for max_fail_percentage 25039 1726867453.71852: checking to see if all hosts have failed and the running result is not ok 25039 1726867453.71853: done checking to see if all hosts have failed 25039 1726867453.71854: getting the remaining hosts for this loop 25039 1726867453.71855: done getting the remaining hosts for this loop 25039 1726867453.71858: getting the next task for host managed_node1 25039 1726867453.71866: done getting next task for host managed_node1 25039 1726867453.71869: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 25039 1726867453.71872: 
^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867453.71883: getting variables 25039 1726867453.71885: in VariableManager get_vars() 25039 1726867453.71917: Calling all_inventory to load vars for managed_node1 25039 1726867453.71920: Calling groups_inventory to load vars for managed_node1 25039 1726867453.71923: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867453.71932: Calling all_plugins_play to load vars for managed_node1 25039 1726867453.71934: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867453.71938: Calling groups_plugins_play to load vars for managed_node1 25039 1726867453.73808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867453.77634: done with get_vars() 25039 1726867453.77662: done getting variables 25039 1726867453.77726: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:24:13 -0400 (0:00:01.064) 0:00:11.303 ****** 25039 
1726867453.77758: entering _queue_task() for managed_node1/debug 25039 1726867453.78471: worker is 1 (out of 1 available) 25039 1726867453.78486: exiting _queue_task() for managed_node1/debug 25039 1726867453.78499: done queuing things up, now waiting for results queue to drain 25039 1726867453.78501: waiting for pending results... 25039 1726867453.79096: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 25039 1726867453.79283: in run() - task 0affcac9-a3a5-3ddc-7272-000000000018 25039 1726867453.79287: variable 'ansible_search_path' from source: unknown 25039 1726867453.79290: variable 'ansible_search_path' from source: unknown 25039 1726867453.79442: calling self._execute() 25039 1726867453.79517: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867453.79525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867453.79535: variable 'omit' from source: magic vars 25039 1726867453.80433: variable 'ansible_distribution_major_version' from source: facts 25039 1726867453.80451: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867453.80644: variable 'omit' from source: magic vars 25039 1726867453.80648: variable 'omit' from source: magic vars 25039 1726867453.80812: variable 'network_provider' from source: set_fact 25039 1726867453.80838: variable 'omit' from source: magic vars 25039 1726867453.81155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867453.81158: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867453.81161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867453.81163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867453.81165: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867453.81168: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867453.81170: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867453.81172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867453.81437: Set connection var ansible_shell_executable to /bin/sh 25039 1726867453.81493: Set connection var ansible_timeout to 10 25039 1726867453.81503: Set connection var ansible_shell_type to sh 25039 1726867453.81513: Set connection var ansible_connection to ssh 25039 1726867453.81582: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867453.81598: Set connection var ansible_pipelining to False 25039 1726867453.81632: variable 'ansible_shell_executable' from source: unknown 25039 1726867453.81687: variable 'ansible_connection' from source: unknown 25039 1726867453.81699: variable 'ansible_module_compression' from source: unknown 25039 1726867453.81707: variable 'ansible_shell_type' from source: unknown 25039 1726867453.81717: variable 'ansible_shell_executable' from source: unknown 25039 1726867453.81727: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867453.81736: variable 'ansible_pipelining' from source: unknown 25039 1726867453.81784: variable 'ansible_timeout' from source: unknown 25039 1726867453.81795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867453.82137: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867453.82154: variable 'omit' from source: magic vars 25039 
1726867453.82165: starting attempt loop 25039 1726867453.82173: running the handler 25039 1726867453.82354: handler run complete 25039 1726867453.82359: attempt loop complete, returning result 25039 1726867453.82368: _execute() done 25039 1726867453.82375: dumping result to json 25039 1726867453.82386: done dumping result, returning 25039 1726867453.82399: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-3ddc-7272-000000000018] 25039 1726867453.82414: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000018 ok: [managed_node1] => {} MSG: Using network provider: nm 25039 1726867453.82572: no more pending results, returning what we have 25039 1726867453.82576: results queue empty 25039 1726867453.82579: checking for any_errors_fatal 25039 1726867453.82589: done checking for any_errors_fatal 25039 1726867453.82590: checking for max_fail_percentage 25039 1726867453.82592: done checking for max_fail_percentage 25039 1726867453.82593: checking to see if all hosts have failed and the running result is not ok 25039 1726867453.82594: done checking to see if all hosts have failed 25039 1726867453.82595: getting the remaining hosts for this loop 25039 1726867453.82596: done getting the remaining hosts for this loop 25039 1726867453.82600: getting the next task for host managed_node1 25039 1726867453.82607: done getting next task for host managed_node1 25039 1726867453.82610: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 25039 1726867453.82613: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867453.82623: getting variables 25039 1726867453.82624: in VariableManager get_vars() 25039 1726867453.82661: Calling all_inventory to load vars for managed_node1 25039 1726867453.82663: Calling groups_inventory to load vars for managed_node1 25039 1726867453.82665: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867453.82675: Calling all_plugins_play to load vars for managed_node1 25039 1726867453.82856: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867453.82863: Calling groups_plugins_play to load vars for managed_node1 25039 1726867453.83436: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000018 25039 1726867453.83439: WORKER PROCESS EXITING 25039 1726867453.85740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867453.89252: done with get_vars() 25039 1726867453.89281: done getting variables 25039 1726867453.89340: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:24:13 -0400 (0:00:00.116) 
0:00:11.419 ****** 25039 1726867453.89380: entering _queue_task() for managed_node1/fail 25039 1726867453.89678: worker is 1 (out of 1 available) 25039 1726867453.89689: exiting _queue_task() for managed_node1/fail 25039 1726867453.89816: done queuing things up, now waiting for results queue to drain 25039 1726867453.89818: waiting for pending results... 25039 1726867453.89976: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 25039 1726867453.90113: in run() - task 0affcac9-a3a5-3ddc-7272-000000000019 25039 1726867453.90137: variable 'ansible_search_path' from source: unknown 25039 1726867453.90146: variable 'ansible_search_path' from source: unknown 25039 1726867453.90189: calling self._execute() 25039 1726867453.90282: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867453.90294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867453.90310: variable 'omit' from source: magic vars 25039 1726867453.90694: variable 'ansible_distribution_major_version' from source: facts 25039 1726867453.90713: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867453.90841: variable 'network_state' from source: role '' defaults 25039 1726867453.90857: Evaluated conditional (network_state != {}): False 25039 1726867453.90864: when evaluation is False, skipping this task 25039 1726867453.90870: _execute() done 25039 1726867453.90875: dumping result to json 25039 1726867453.90883: done dumping result, returning 25039 1726867453.90896: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-3ddc-7272-000000000019] 25039 1726867453.90910: sending task result for task 
0affcac9-a3a5-3ddc-7272-000000000019 25039 1726867453.91185: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000019 25039 1726867453.91188: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25039 1726867453.91261: no more pending results, returning what we have 25039 1726867453.91265: results queue empty 25039 1726867453.91267: checking for any_errors_fatal 25039 1726867453.91275: done checking for any_errors_fatal 25039 1726867453.91276: checking for max_fail_percentage 25039 1726867453.91281: done checking for max_fail_percentage 25039 1726867453.91282: checking to see if all hosts have failed and the running result is not ok 25039 1726867453.91283: done checking to see if all hosts have failed 25039 1726867453.91284: getting the remaining hosts for this loop 25039 1726867453.91286: done getting the remaining hosts for this loop 25039 1726867453.91290: getting the next task for host managed_node1 25039 1726867453.91297: done getting next task for host managed_node1 25039 1726867453.91301: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25039 1726867453.91305: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867453.91325: getting variables 25039 1726867453.91443: in VariableManager get_vars() 25039 1726867453.91506: Calling all_inventory to load vars for managed_node1 25039 1726867453.91512: Calling groups_inventory to load vars for managed_node1 25039 1726867453.91514: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867453.91523: Calling all_plugins_play to load vars for managed_node1 25039 1726867453.91526: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867453.91529: Calling groups_plugins_play to load vars for managed_node1 25039 1726867453.93133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867453.94835: done with get_vars() 25039 1726867453.94866: done getting variables 25039 1726867453.94938: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:24:13 -0400 (0:00:00.055) 0:00:11.475 ****** 25039 1726867453.94976: entering _queue_task() for managed_node1/fail 25039 1726867453.95309: worker is 1 (out of 1 available) 25039 1726867453.95322: exiting _queue_task() for managed_node1/fail 25039 1726867453.95335: done queuing things up, now waiting for results queue to drain 25039 1726867453.95337: waiting for pending results... 
25039 1726867453.95553: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25039 1726867453.95904: in run() - task 0affcac9-a3a5-3ddc-7272-00000000001a 25039 1726867453.95912: variable 'ansible_search_path' from source: unknown 25039 1726867453.95915: variable 'ansible_search_path' from source: unknown 25039 1726867453.95919: calling self._execute() 25039 1726867453.95922: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867453.95948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867453.96000: variable 'omit' from source: magic vars 25039 1726867453.96675: variable 'ansible_distribution_major_version' from source: facts 25039 1726867453.96799: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867453.97182: variable 'network_state' from source: role '' defaults 25039 1726867453.97186: Evaluated conditional (network_state != {}): False 25039 1726867453.97189: when evaluation is False, skipping this task 25039 1726867453.97192: _execute() done 25039 1726867453.97194: dumping result to json 25039 1726867453.97197: done dumping result, returning 25039 1726867453.97200: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-3ddc-7272-00000000001a] 25039 1726867453.97203: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000001a skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25039 1726867453.97335: no more pending results, returning what we have 25039 1726867453.97340: results queue empty 25039 1726867453.97342: checking for any_errors_fatal 25039 1726867453.97351: done checking for any_errors_fatal 
25039 1726867453.97352: checking for max_fail_percentage 25039 1726867453.97354: done checking for max_fail_percentage 25039 1726867453.97355: checking to see if all hosts have failed and the running result is not ok 25039 1726867453.97356: done checking to see if all hosts have failed 25039 1726867453.97356: getting the remaining hosts for this loop 25039 1726867453.97358: done getting the remaining hosts for this loop 25039 1726867453.97361: getting the next task for host managed_node1 25039 1726867453.97368: done getting next task for host managed_node1 25039 1726867453.97371: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25039 1726867453.97374: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867453.97391: getting variables 25039 1726867453.97393: in VariableManager get_vars() 25039 1726867453.97431: Calling all_inventory to load vars for managed_node1 25039 1726867453.97434: Calling groups_inventory to load vars for managed_node1 25039 1726867453.97436: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867453.97448: Calling all_plugins_play to load vars for managed_node1 25039 1726867453.97450: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867453.97453: Calling groups_plugins_play to load vars for managed_node1 25039 1726867453.98595: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000001a 25039 1726867453.98599: WORKER PROCESS EXITING 25039 1726867453.99707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867454.01566: done with get_vars() 25039 1726867454.01591: done getting variables 25039 1726867454.01650: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:24:14 -0400 (0:00:00.067) 0:00:11.542 ****** 25039 1726867454.01684: entering _queue_task() for managed_node1/fail 25039 1726867454.02147: worker is 1 (out of 1 available) 25039 1726867454.02160: exiting _queue_task() for managed_node1/fail 25039 1726867454.02172: done queuing things up, now waiting for results queue to drain 25039 1726867454.02173: waiting for pending results... 
25039 1726867454.02488: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25039 1726867454.02640: in run() - task 0affcac9-a3a5-3ddc-7272-00000000001b 25039 1726867454.02676: variable 'ansible_search_path' from source: unknown 25039 1726867454.02688: variable 'ansible_search_path' from source: unknown 25039 1726867454.02781: calling self._execute() 25039 1726867454.02829: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867454.02841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867454.02857: variable 'omit' from source: magic vars 25039 1726867454.03240: variable 'ansible_distribution_major_version' from source: facts 25039 1726867454.03259: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867454.03453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867454.05455: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867454.05507: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867454.05538: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867454.05564: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867454.05584: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867454.05644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.05667: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.05686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.05716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.05728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.05793: variable 'ansible_distribution_major_version' from source: facts 25039 1726867454.05806: Evaluated conditional (ansible_distribution_major_version | int > 9): True 25039 1726867454.05883: variable 'ansible_distribution' from source: facts 25039 1726867454.05887: variable '__network_rh_distros' from source: role '' defaults 25039 1726867454.05895: Evaluated conditional (ansible_distribution in __network_rh_distros): True 25039 1726867454.06050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.06067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.06087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 
1726867454.06116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.06126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.06160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.06175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.06194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.06223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.06233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.06264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.06280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 25039 1726867454.06299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.06327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.06337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.06528: variable 'network_connections' from source: task vars 25039 1726867454.06534: variable 'interface' from source: play vars 25039 1726867454.06599: variable 'interface' from source: play vars 25039 1726867454.06609: variable 'network_state' from source: role '' defaults 25039 1726867454.06740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867454.06869: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867454.06936: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867454.06983: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867454.07018: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867454.07061: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867454.07097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867454.07193: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.07196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867454.07199: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 25039 1726867454.07209: when evaluation is False, skipping this task 25039 1726867454.07216: _execute() done 25039 1726867454.07223: dumping result to json 25039 1726867454.07229: done dumping result, returning 25039 1726867454.07238: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-3ddc-7272-00000000001b] 25039 1726867454.07282: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000001b 25039 1726867454.07360: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000001b 25039 1726867454.07363: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 25039 1726867454.07411: no more pending results, returning what we have 25039 
1726867454.07416: results queue empty 25039 1726867454.07417: checking for any_errors_fatal 25039 1726867454.07423: done checking for any_errors_fatal 25039 1726867454.07424: checking for max_fail_percentage 25039 1726867454.07425: done checking for max_fail_percentage 25039 1726867454.07426: checking to see if all hosts have failed and the running result is not ok 25039 1726867454.07427: done checking to see if all hosts have failed 25039 1726867454.07428: getting the remaining hosts for this loop 25039 1726867454.07429: done getting the remaining hosts for this loop 25039 1726867454.07433: getting the next task for host managed_node1 25039 1726867454.07441: done getting next task for host managed_node1 25039 1726867454.07445: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25039 1726867454.07448: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867454.07461: getting variables 25039 1726867454.07463: in VariableManager get_vars() 25039 1726867454.07558: Calling all_inventory to load vars for managed_node1 25039 1726867454.07561: Calling groups_inventory to load vars for managed_node1 25039 1726867454.07563: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867454.07574: Calling all_plugins_play to load vars for managed_node1 25039 1726867454.07576: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867454.07583: Calling groups_plugins_play to load vars for managed_node1 25039 1726867454.08409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867454.09282: done with get_vars() 25039 1726867454.09296: done getting variables 25039 1726867454.09365: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:24:14 -0400 (0:00:00.077) 0:00:11.619 ****** 25039 1726867454.09389: entering _queue_task() for managed_node1/dnf 25039 1726867454.09637: worker is 1 (out of 1 available) 25039 1726867454.09650: exiting _queue_task() for managed_node1/dnf 25039 1726867454.09662: done queuing things up, now waiting for results queue to drain 25039 1726867454.09664: waiting for pending results... 
25039 1726867454.10010: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25039 1726867454.10014: in run() - task 0affcac9-a3a5-3ddc-7272-00000000001c 25039 1726867454.10018: variable 'ansible_search_path' from source: unknown 25039 1726867454.10021: variable 'ansible_search_path' from source: unknown 25039 1726867454.10041: calling self._execute() 25039 1726867454.10128: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867454.10140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867454.10154: variable 'omit' from source: magic vars 25039 1726867454.10541: variable 'ansible_distribution_major_version' from source: facts 25039 1726867454.10552: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867454.10753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867454.12536: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867454.12579: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867454.12609: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867454.12642: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867454.12660: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867454.12742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.12761: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.12779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.12819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.12837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.12945: variable 'ansible_distribution' from source: facts 25039 1726867454.12952: variable 'ansible_distribution_major_version' from source: facts 25039 1726867454.13183: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 25039 1726867454.13186: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867454.13188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.13211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.13230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.13267: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.13281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.13324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.13345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.13366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.13404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.13420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.13453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.13476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 
1726867454.13505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.13542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.13555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.13680: variable 'network_connections' from source: task vars 25039 1726867454.13696: variable 'interface' from source: play vars 25039 1726867454.13769: variable 'interface' from source: play vars 25039 1726867454.13819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867454.13992: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867454.14011: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867454.14186: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867454.14190: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867454.14192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867454.14195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867454.14205: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.14210: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867454.14223: variable '__network_team_connections_defined' from source: role '' defaults 25039 1726867454.14446: variable 'network_connections' from source: task vars 25039 1726867454.14450: variable 'interface' from source: play vars 25039 1726867454.14514: variable 'interface' from source: play vars 25039 1726867454.14540: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25039 1726867454.14544: when evaluation is False, skipping this task 25039 1726867454.14547: _execute() done 25039 1726867454.14549: dumping result to json 25039 1726867454.14552: done dumping result, returning 25039 1726867454.14621: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-3ddc-7272-00000000001c] 25039 1726867454.14624: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000001c 25039 1726867454.14689: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000001c 25039 1726867454.14691: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25039 1726867454.14768: no more pending results, returning what we have 25039 1726867454.14866: results queue empty 25039 1726867454.14867: checking for any_errors_fatal 25039 1726867454.14874: done checking for any_errors_fatal 25039 1726867454.14874: 
checking for max_fail_percentage 25039 1726867454.14876: done checking for max_fail_percentage 25039 1726867454.14876: checking to see if all hosts have failed and the running result is not ok 25039 1726867454.14898: done checking to see if all hosts have failed 25039 1726867454.14899: getting the remaining hosts for this loop 25039 1726867454.14900: done getting the remaining hosts for this loop 25039 1726867454.14904: getting the next task for host managed_node1 25039 1726867454.14909: done getting next task for host managed_node1 25039 1726867454.14913: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25039 1726867454.14916: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867454.14929: getting variables 25039 1726867454.14930: in VariableManager get_vars() 25039 1726867454.14969: Calling all_inventory to load vars for managed_node1 25039 1726867454.14972: Calling groups_inventory to load vars for managed_node1 25039 1726867454.14974: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867454.14987: Calling all_plugins_play to load vars for managed_node1 25039 1726867454.14990: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867454.14994: Calling groups_plugins_play to load vars for managed_node1 25039 1726867454.15829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867454.16699: done with get_vars() 25039 1726867454.16717: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25039 1726867454.16769: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:24:14 -0400 (0:00:00.074) 0:00:11.693 ****** 25039 1726867454.16791: entering _queue_task() for managed_node1/yum 25039 1726867454.16793: Creating lock for yum 25039 1726867454.17086: worker is 1 (out of 1 available) 25039 1726867454.17098: exiting _queue_task() for managed_node1/yum 25039 1726867454.17115: done queuing things up, now waiting for results queue to drain 25039 1726867454.17117: waiting for pending results... 
25039 1726867454.17494: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25039 1726867454.17537: in run() - task 0affcac9-a3a5-3ddc-7272-00000000001d 25039 1726867454.17557: variable 'ansible_search_path' from source: unknown 25039 1726867454.17567: variable 'ansible_search_path' from source: unknown 25039 1726867454.17618: calling self._execute() 25039 1726867454.17723: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867454.17785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867454.17790: variable 'omit' from source: magic vars 25039 1726867454.18041: variable 'ansible_distribution_major_version' from source: facts 25039 1726867454.18053: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867454.18183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867454.19654: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867454.19710: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867454.19739: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867454.19764: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867454.19787: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867454.19846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.19865: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.19884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.19917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.19928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.19993: variable 'ansible_distribution_major_version' from source: facts 25039 1726867454.20007: Evaluated conditional (ansible_distribution_major_version | int < 8): False 25039 1726867454.20011: when evaluation is False, skipping this task 25039 1726867454.20015: _execute() done 25039 1726867454.20023: dumping result to json 25039 1726867454.20026: done dumping result, returning 25039 1726867454.20029: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-3ddc-7272-00000000001d] 25039 1726867454.20035: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000001d 25039 1726867454.20122: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000001d 25039 1726867454.20126: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 25039 1726867454.20171: no more pending results, returning 
what we have 25039 1726867454.20173: results queue empty 25039 1726867454.20174: checking for any_errors_fatal 25039 1726867454.20188: done checking for any_errors_fatal 25039 1726867454.20189: checking for max_fail_percentage 25039 1726867454.20190: done checking for max_fail_percentage 25039 1726867454.20191: checking to see if all hosts have failed and the running result is not ok 25039 1726867454.20192: done checking to see if all hosts have failed 25039 1726867454.20193: getting the remaining hosts for this loop 25039 1726867454.20194: done getting the remaining hosts for this loop 25039 1726867454.20197: getting the next task for host managed_node1 25039 1726867454.20204: done getting next task for host managed_node1 25039 1726867454.20207: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25039 1726867454.20210: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867454.20222: getting variables 25039 1726867454.20224: in VariableManager get_vars() 25039 1726867454.20261: Calling all_inventory to load vars for managed_node1 25039 1726867454.20263: Calling groups_inventory to load vars for managed_node1 25039 1726867454.20265: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867454.20274: Calling all_plugins_play to load vars for managed_node1 25039 1726867454.20279: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867454.20282: Calling groups_plugins_play to load vars for managed_node1 25039 1726867454.21045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867454.21993: done with get_vars() 25039 1726867454.22008: done getting variables 25039 1726867454.22050: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:24:14 -0400 (0:00:00.052) 0:00:11.746 ****** 25039 1726867454.22073: entering _queue_task() for managed_node1/fail 25039 1726867454.22286: worker is 1 (out of 1 available) 25039 1726867454.22300: exiting _queue_task() for managed_node1/fail 25039 1726867454.22313: done queuing things up, now waiting for results queue to drain 25039 1726867454.22315: waiting for pending results... 
25039 1726867454.22478: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
25039 1726867454.22568: in run() - task 0affcac9-a3a5-3ddc-7272-00000000001e
25039 1726867454.22581: variable 'ansible_search_path' from source: unknown
25039 1726867454.22585: variable 'ansible_search_path' from source: unknown
25039 1726867454.22615: calling self._execute()
25039 1726867454.22682: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867454.22686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867454.22695: variable 'omit' from source: magic vars
25039 1726867454.22945: variable 'ansible_distribution_major_version' from source: facts
25039 1726867454.22954: Evaluated conditional (ansible_distribution_major_version != '6'): True
25039 1726867454.23038: variable '__network_wireless_connections_defined' from source: role '' defaults
25039 1726867454.23165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
25039 1726867454.24593: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
25039 1726867454.24649: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
25039 1726867454.24675: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
25039 1726867454.24702: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
25039 1726867454.24727: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
25039 1726867454.24783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
25039 1726867454.24803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
25039 1726867454.24825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
25039 1726867454.24851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
25039 1726867454.24862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
25039 1726867454.24895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
25039 1726867454.24911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
25039 1726867454.24933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
25039 1726867454.24959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
25039 1726867454.24969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
25039 1726867454.25000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
25039 1726867454.25019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
25039 1726867454.25040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
25039 1726867454.25061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
25039 1726867454.25072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
25039 1726867454.25186: variable 'network_connections' from source: task vars
25039 1726867454.25196: variable 'interface' from source: play vars
25039 1726867454.25248: variable 'interface' from source: play vars
25039 1726867454.25300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
25039 1726867454.25408: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
25039 1726867454.25436: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
25039 1726867454.25469: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
25039 1726867454.25493: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
25039 1726867454.25524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
25039 1726867454.25538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
25039 1726867454.25556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
25039 1726867454.25572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
25039 1726867454.25623: variable '__network_team_connections_defined' from source: role '' defaults
25039 1726867454.25772: variable 'network_connections' from source: task vars
25039 1726867454.25775: variable 'interface' from source: play vars
25039 1726867454.25826: variable 'interface' from source: play vars
25039 1726867454.25849: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
25039 1726867454.25853: when evaluation is False, skipping this task
25039 1726867454.25856: _execute() done
25039 1726867454.25858: dumping result to json
25039 1726867454.25861: done dumping result, returning
25039 1726867454.25867: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-3ddc-7272-00000000001e]
25039 1726867454.25872: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000001e
25039 1726867454.25958: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000001e
25039 1726867454.25961: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
25039 1726867454.26011: no more pending results, returning what we have
25039 1726867454.26015: results queue empty
25039 1726867454.26016: checking for any_errors_fatal
25039 1726867454.26021: done checking for any_errors_fatal
25039 1726867454.26022: checking for max_fail_percentage
25039 1726867454.26023: done checking for max_fail_percentage
25039 1726867454.26024: checking to see if all hosts have failed and the running result is not ok
25039 1726867454.26025: done checking to see if all hosts have failed
25039 1726867454.26026: getting the remaining hosts for this loop
25039 1726867454.26028: done getting the remaining hosts for this loop
25039 1726867454.26031: getting the next task for host managed_node1
25039 1726867454.26038: done getting next task for host managed_node1
25039 1726867454.26041: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
25039 1726867454.26044: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867454.26058: getting variables
25039 1726867454.26059: in VariableManager get_vars()
25039 1726867454.26098: Calling all_inventory to load vars for managed_node1
25039 1726867454.26101: Calling groups_inventory to load vars for managed_node1
25039 1726867454.26103: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867454.26113: Calling all_plugins_play to load vars for managed_node1
25039 1726867454.26115: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867454.26118: Calling groups_plugins_play to load vars for managed_node1
25039 1726867454.26885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867454.27750: done with get_vars()
25039 1726867454.27766: done getting variables
25039 1726867454.27808: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 17:24:14 -0400 (0:00:00.057) 0:00:11.803 ******
25039 1726867454.27832: entering _queue_task() for managed_node1/package
25039 1726867454.28045: worker is 1 (out of 1 available)
25039 1726867454.28059: exiting _queue_task() for managed_node1/package
25039 1726867454.28072: done queuing things up, now waiting for results queue to drain
25039 1726867454.28074: waiting for pending results...
25039 1726867454.28235: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
25039 1726867454.28315: in run() - task 0affcac9-a3a5-3ddc-7272-00000000001f
25039 1726867454.28326: variable 'ansible_search_path' from source: unknown
25039 1726867454.28329: variable 'ansible_search_path' from source: unknown
25039 1726867454.28356: calling self._execute()
25039 1726867454.28424: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867454.28429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867454.28437: variable 'omit' from source: magic vars
25039 1726867454.28685: variable 'ansible_distribution_major_version' from source: facts
25039 1726867454.28695: Evaluated conditional (ansible_distribution_major_version != '6'): True
25039 1726867454.28826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
25039 1726867454.29016: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
25039 1726867454.29048: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
25039 1726867454.29076: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
25039 1726867454.29103: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
25039 1726867454.29180: variable 'network_packages' from source: role '' defaults
25039 1726867454.29250: variable '__network_provider_setup' from source: role '' defaults
25039 1726867454.29259: variable '__network_service_name_default_nm' from source: role '' defaults
25039 1726867454.29311: variable '__network_service_name_default_nm' from source: role '' defaults
25039 1726867454.29322: variable '__network_packages_default_nm' from source: role '' defaults
25039 1726867454.29364: variable '__network_packages_default_nm' from source: role '' defaults
25039 1726867454.29479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
25039 1726867454.30781: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
25039 1726867454.30824: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
25039 1726867454.30859: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
25039 1726867454.30884: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
25039 1726867454.30906: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
25039 1726867454.30964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
25039 1726867454.30986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
25039 1726867454.31004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
25039 1726867454.31036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
25039 1726867454.31047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
25039 1726867454.31079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
25039 1726867454.31095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
25039 1726867454.31112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
25039 1726867454.31141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
25039 1726867454.31151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
25039 1726867454.31290: variable '__network_packages_default_gobject_packages' from source: role '' defaults
25039 1726867454.31362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
25039 1726867454.31380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
25039 1726867454.31396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
25039 1726867454.31423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
25039 1726867454.31433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
25039 1726867454.31496: variable 'ansible_python' from source: facts
25039 1726867454.31518: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
25039 1726867454.31572: variable '__network_wpa_supplicant_required' from source: role '' defaults
25039 1726867454.31628: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
25039 1726867454.31711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
25039 1726867454.31729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
25039 1726867454.31746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
25039 1726867454.31769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
25039 1726867454.31785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
25039 1726867454.31816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
25039 1726867454.31835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
25039 1726867454.31851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
25039 1726867454.31875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
25039 1726867454.31892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
25039 1726867454.31982: variable 'network_connections' from source: task vars
25039 1726867454.31987: variable 'interface' from source: play vars
25039 1726867454.32060: variable 'interface' from source: play vars
25039 1726867454.32112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
25039 1726867454.32135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
25039 1726867454.32154: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
25039 1726867454.32174: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
25039 1726867454.32210: variable '__network_wireless_connections_defined' from source: role '' defaults
25039 1726867454.32387: variable 'network_connections' from source: task vars
25039 1726867454.32390: variable 'interface' from source: play vars
25039 1726867454.32464: variable 'interface' from source: play vars
25039 1726867454.32503: variable '__network_packages_default_wireless' from source: role '' defaults
25039 1726867454.32560: variable '__network_wireless_connections_defined' from source: role '' defaults
25039 1726867454.32751: variable 'network_connections' from source: task vars
25039 1726867454.32756: variable 'interface' from source: play vars
25039 1726867454.32804: variable 'interface' from source: play vars
25039 1726867454.32825: variable '__network_packages_default_team' from source: role '' defaults
25039 1726867454.32882: variable '__network_team_connections_defined' from source: role '' defaults
25039 1726867454.33066: variable 'network_connections' from source: task vars
25039 1726867454.33069: variable 'interface' from source: play vars
25039 1726867454.33121: variable 'interface' from source: play vars
25039 1726867454.33161: variable '__network_service_name_default_initscripts' from source: role '' defaults
25039 1726867454.33207: variable '__network_service_name_default_initscripts' from source: role '' defaults
25039 1726867454.33215: variable '__network_packages_default_initscripts' from source: role '' defaults
25039 1726867454.33256: variable '__network_packages_default_initscripts' from source: role '' defaults
25039 1726867454.33388: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
25039 1726867454.33925: variable 'network_connections' from source: task vars
25039 1726867454.33929: variable 'interface' from source: play vars
25039 1726867454.33975: variable 'interface' from source: play vars
25039 1726867454.33984: variable 'ansible_distribution' from source: facts
25039 1726867454.33987: variable '__network_rh_distros' from source: role '' defaults
25039 1726867454.33993: variable 'ansible_distribution_major_version' from source: facts
25039 1726867454.34009: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
25039 1726867454.34118: variable 'ansible_distribution' from source: facts
25039 1726867454.34121: variable '__network_rh_distros' from source: role '' defaults
25039 1726867454.34126: variable 'ansible_distribution_major_version' from source: facts
25039 1726867454.34137: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
25039 1726867454.34242: variable 'ansible_distribution' from source: facts
25039 1726867454.34246: variable '__network_rh_distros' from source: role '' defaults
25039 1726867454.34250: variable 'ansible_distribution_major_version' from source: facts
25039 1726867454.34275: variable 'network_provider' from source: set_fact
25039 1726867454.34293: variable 'ansible_facts' from source: unknown
25039 1726867454.37784: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
25039 1726867454.37789: when evaluation is False, skipping this task
25039 1726867454.37791: _execute() done
25039 1726867454.37794: dumping result to json
25039 1726867454.37796: done dumping result, returning
25039 1726867454.37798: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-3ddc-7272-00000000001f]
25039 1726867454.37800: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000001f
25039 1726867454.37876: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000001f
25039 1726867454.37883: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
25039 1726867454.37932: no more pending results, returning what we have
25039 1726867454.37935: results queue empty
25039 1726867454.37936: checking for any_errors_fatal
25039 1726867454.37943: done checking for any_errors_fatal
25039 1726867454.37944: checking for max_fail_percentage
25039 1726867454.37945: done checking for max_fail_percentage
25039 1726867454.37946: checking to see if all hosts have failed and the running result is not ok
25039 1726867454.37947: done checking to see if all hosts have failed
25039 1726867454.37948: getting the remaining hosts for this loop
25039 1726867454.37949: done getting the remaining hosts for this loop
25039 1726867454.37954: getting the next task for host managed_node1
25039 1726867454.37960: done getting next task for host managed_node1
25039 1726867454.37963: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
25039 1726867454.37966: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867454.37980: getting variables
25039 1726867454.37982: in VariableManager get_vars()
25039 1726867454.38017: Calling all_inventory to load vars for managed_node1
25039 1726867454.38020: Calling groups_inventory to load vars for managed_node1
25039 1726867454.38022: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867454.38031: Calling all_plugins_play to load vars for managed_node1
25039 1726867454.38034: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867454.38036: Calling groups_plugins_play to load vars for managed_node1
25039 1726867454.39085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867454.43362: done with get_vars()
25039 1726867454.43385: done getting variables
25039 1726867454.43434: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 17:24:14 -0400 (0:00:00.156) 0:00:11.960 ******
25039 1726867454.43460: entering _queue_task() for managed_node1/package
25039 1726867454.43789: worker is 1 (out of 1 available)
25039 1726867454.43802: exiting _queue_task() for managed_node1/package
25039 1726867454.43816: done queuing things up, now waiting for results queue to drain
25039 1726867454.43818: waiting for pending results...
25039 1726867454.44199: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
25039 1726867454.44236: in run() - task 0affcac9-a3a5-3ddc-7272-000000000020
25039 1726867454.44259: variable 'ansible_search_path' from source: unknown
25039 1726867454.44268: variable 'ansible_search_path' from source: unknown
25039 1726867454.44315: calling self._execute()
25039 1726867454.44399: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867454.44416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867454.44428: variable 'omit' from source: magic vars
25039 1726867454.44820: variable 'ansible_distribution_major_version' from source: facts
25039 1726867454.44842: Evaluated conditional (ansible_distribution_major_version != '6'): True
25039 1726867454.44976: variable 'network_state' from source: role '' defaults
25039 1726867454.44995: Evaluated conditional (network_state != {}): False
25039 1726867454.45003: when evaluation is False, skipping this task
25039 1726867454.45014: _execute() done
25039 1726867454.45023: dumping result to json
25039 1726867454.45030: done dumping result, returning
25039 1726867454.45042: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-3ddc-7272-000000000020]
25039 1726867454.45056: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000020
25039 1726867454.45260: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000020
25039 1726867454.45263: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
25039 1726867454.45319: no more pending results, returning what we have
25039 1726867454.45323: results queue empty
25039 1726867454.45324: checking for any_errors_fatal
25039 1726867454.45335: done checking for any_errors_fatal
25039 1726867454.45336: checking for max_fail_percentage
25039 1726867454.45338: done checking for max_fail_percentage
25039 1726867454.45339: checking to see if all hosts have failed and the running result is not ok
25039 1726867454.45340: done checking to see if all hosts have failed
25039 1726867454.45341: getting the remaining hosts for this loop
25039 1726867454.45342: done getting the remaining hosts for this loop
25039 1726867454.45346: getting the next task for host managed_node1
25039 1726867454.45355: done getting next task for host managed_node1
25039 1726867454.45359: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
25039 1726867454.45362: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867454.45381: getting variables
25039 1726867454.45383: in VariableManager get_vars()
25039 1726867454.45429: Calling all_inventory to load vars for managed_node1
25039 1726867454.45432: Calling groups_inventory to load vars for managed_node1
25039 1726867454.45435: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867454.45449: Calling all_plugins_play to load vars for managed_node1
25039 1726867454.45452: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867454.45455: Calling groups_plugins_play to load vars for managed_node1
25039 1726867454.46936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867454.48548: done with get_vars()
25039 1726867454.48569: done getting variables
25039 1726867454.48632: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 17:24:14 -0400 (0:00:00.052) 0:00:12.012 ******
25039 1726867454.48665: entering _queue_task() for managed_node1/package
25039 1726867454.48970: worker is 1 (out of 1 available)
25039 1726867454.48983: exiting _queue_task() for managed_node1/package
25039 1726867454.48997: done queuing things up, now waiting for results queue to drain
25039 1726867454.48999: waiting for pending results...
25039 1726867454.49395: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25039 1726867454.49400: in run() - task 0affcac9-a3a5-3ddc-7272-000000000021 25039 1726867454.49418: variable 'ansible_search_path' from source: unknown 25039 1726867454.49426: variable 'ansible_search_path' from source: unknown 25039 1726867454.49465: calling self._execute() 25039 1726867454.49561: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867454.49574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867454.49593: variable 'omit' from source: magic vars 25039 1726867454.49974: variable 'ansible_distribution_major_version' from source: facts 25039 1726867454.49991: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867454.50126: variable 'network_state' from source: role '' defaults 25039 1726867454.50149: Evaluated conditional (network_state != {}): False 25039 1726867454.50160: when evaluation is False, skipping this task 25039 1726867454.50168: _execute() done 25039 1726867454.50175: dumping result to json 25039 1726867454.50184: done dumping result, returning 25039 1726867454.50195: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-3ddc-7272-000000000021] 25039 1726867454.50204: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000021 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25039 1726867454.50479: no more pending results, returning what we have 25039 1726867454.50483: results queue empty 25039 1726867454.50485: checking for any_errors_fatal 25039 1726867454.50491: done checking for any_errors_fatal 25039 1726867454.50492: checking for max_fail_percentage 25039 
1726867454.50494: done checking for max_fail_percentage 25039 1726867454.50495: checking to see if all hosts have failed and the running result is not ok 25039 1726867454.50496: done checking to see if all hosts have failed 25039 1726867454.50497: getting the remaining hosts for this loop 25039 1726867454.50498: done getting the remaining hosts for this loop 25039 1726867454.50502: getting the next task for host managed_node1 25039 1726867454.50511: done getting next task for host managed_node1 25039 1726867454.50514: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25039 1726867454.50518: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867454.50531: getting variables 25039 1726867454.50533: in VariableManager get_vars() 25039 1726867454.50573: Calling all_inventory to load vars for managed_node1 25039 1726867454.50575: Calling groups_inventory to load vars for managed_node1 25039 1726867454.50580: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867454.50593: Calling all_plugins_play to load vars for managed_node1 25039 1726867454.50595: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867454.50598: Calling groups_plugins_play to load vars for managed_node1 25039 1726867454.51190: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000021 25039 1726867454.51193: WORKER PROCESS EXITING 25039 1726867454.52181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867454.53710: done with get_vars() 25039 1726867454.53735: done getting variables 25039 1726867454.53844: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:24:14 -0400 (0:00:00.052) 0:00:12.064 ****** 25039 1726867454.53881: entering _queue_task() for managed_node1/service 25039 1726867454.53883: Creating lock for service 25039 1726867454.54230: worker is 1 (out of 1 available) 25039 1726867454.54242: exiting _queue_task() for managed_node1/service 25039 1726867454.54255: done queuing things up, now waiting for results queue to drain 25039 1726867454.54257: waiting for pending results... 
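The `skipping: [managed_node1]` result above (`"false_condition": "network_state != {}"`) is what a task guarded by a `when:` list produces when one condition fails: `network_state` comes from the role defaults and is `{}`, so the package install never runs. A minimal sketch of such a task — illustrative only, not the role's actual source — looks like:

```yaml
# Illustrative sketch, not the real fedora.linux_system_roles.network task.
# The first condition evaluated True in the log; the second evaluated False,
# so Ansible reports "Conditional result was False" and skips the task.
- name: Install python3-libnmstate when using network_state variable
  package:
    name: python3-libnmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```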
25039 1726867454.54544: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25039 1726867454.54684: in run() - task 0affcac9-a3a5-3ddc-7272-000000000022 25039 1726867454.54706: variable 'ansible_search_path' from source: unknown 25039 1726867454.54719: variable 'ansible_search_path' from source: unknown 25039 1726867454.54757: calling self._execute() 25039 1726867454.54850: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867454.54862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867454.54875: variable 'omit' from source: magic vars 25039 1726867454.55241: variable 'ansible_distribution_major_version' from source: facts 25039 1726867454.55258: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867454.55386: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867454.55588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867454.57843: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867454.57935: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867454.57976: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867454.58021: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867454.58058: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867454.58145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 25039 1726867454.58182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.58212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.58260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.58282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.58334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.58466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.58470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.58473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.58476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.58502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.58533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.58562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.58613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.58634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.58828: variable 'network_connections' from source: task vars 25039 1726867454.58846: variable 'interface' from source: play vars 25039 1726867454.58931: variable 'interface' from source: play vars 25039 1726867454.59012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867454.59189: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867454.59248: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867454.59353: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867454.59357: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867454.59370: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867454.59397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867454.59428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.59458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867454.59523: variable '__network_team_connections_defined' from source: role '' defaults 25039 1726867454.59791: variable 'network_connections' from source: task vars 25039 1726867454.59794: variable 'interface' from source: play vars 25039 1726867454.59851: variable 'interface' from source: play vars 25039 1726867454.59900: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25039 1726867454.59904: when evaluation is False, skipping this task 25039 1726867454.59982: _execute() done 25039 1726867454.59985: dumping result to json 25039 1726867454.59988: done dumping result, returning 25039 1726867454.59990: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-3ddc-7272-000000000022] 25039 1726867454.59992: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000022 25039 1726867454.60067: done sending task result for task 
0affcac9-a3a5-3ddc-7272-000000000022 25039 1726867454.60079: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25039 1726867454.60129: no more pending results, returning what we have 25039 1726867454.60132: results queue empty 25039 1726867454.60133: checking for any_errors_fatal 25039 1726867454.60140: done checking for any_errors_fatal 25039 1726867454.60141: checking for max_fail_percentage 25039 1726867454.60142: done checking for max_fail_percentage 25039 1726867454.60143: checking to see if all hosts have failed and the running result is not ok 25039 1726867454.60144: done checking to see if all hosts have failed 25039 1726867454.60145: getting the remaining hosts for this loop 25039 1726867454.60146: done getting the remaining hosts for this loop 25039 1726867454.60150: getting the next task for host managed_node1 25039 1726867454.60157: done getting next task for host managed_node1 25039 1726867454.60161: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25039 1726867454.60164: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867454.60179: getting variables 25039 1726867454.60181: in VariableManager get_vars() 25039 1726867454.60226: Calling all_inventory to load vars for managed_node1 25039 1726867454.60229: Calling groups_inventory to load vars for managed_node1 25039 1726867454.60232: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867454.60244: Calling all_plugins_play to load vars for managed_node1 25039 1726867454.60247: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867454.60249: Calling groups_plugins_play to load vars for managed_node1 25039 1726867454.61850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867454.63398: done with get_vars() 25039 1726867454.63424: done getting variables 25039 1726867454.63488: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:24:14 -0400 (0:00:00.096) 0:00:12.160 ****** 25039 1726867454.63523: entering _queue_task() for managed_node1/service 25039 1726867454.63842: worker is 1 (out of 1 available) 25039 1726867454.63854: exiting _queue_task() for managed_node1/service 25039 1726867454.63867: done queuing things up, now waiting for results queue to drain 25039 1726867454.63868: waiting for pending results... 
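The restart task skipped above is driven by an `or` of two role-default booleans; since this run defines neither wireless nor team connections, the whole expression is False. A hedged sketch of a task shaped like the one at `roles/network/tasks/main.yml:109` (names are assumptions, not the role's verbatim source):

```yaml
# Illustrative sketch only -- the real task lives at roles/network/tasks/main.yml:109
# and may differ in detail. Both defined-connection flags are False in this run,
# so the "or" conditional fails and NetworkManager is not restarted.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```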
25039 1726867454.64297: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25039 1726867454.64302: in run() - task 0affcac9-a3a5-3ddc-7272-000000000023 25039 1726867454.64306: variable 'ansible_search_path' from source: unknown 25039 1726867454.64311: variable 'ansible_search_path' from source: unknown 25039 1726867454.64353: calling self._execute() 25039 1726867454.64450: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867454.64463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867454.64474: variable 'omit' from source: magic vars 25039 1726867454.64858: variable 'ansible_distribution_major_version' from source: facts 25039 1726867454.64939: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867454.65045: variable 'network_provider' from source: set_fact 25039 1726867454.65057: variable 'network_state' from source: role '' defaults 25039 1726867454.65070: Evaluated conditional (network_provider == "nm" or network_state != {}): True 25039 1726867454.65083: variable 'omit' from source: magic vars 25039 1726867454.65136: variable 'omit' from source: magic vars 25039 1726867454.65269: variable 'network_service_name' from source: role '' defaults 25039 1726867454.65273: variable 'network_service_name' from source: role '' defaults 25039 1726867454.65354: variable '__network_provider_setup' from source: role '' defaults 25039 1726867454.65366: variable '__network_service_name_default_nm' from source: role '' defaults 25039 1726867454.65437: variable '__network_service_name_default_nm' from source: role '' defaults 25039 1726867454.65451: variable '__network_packages_default_nm' from source: role '' defaults 25039 1726867454.65520: variable '__network_packages_default_nm' from source: role '' defaults 25039 1726867454.65754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 25039 1726867454.68153: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867454.68238: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867454.68276: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867454.68322: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867454.68353: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867454.68438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.68470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.68500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.68684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.68687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.68689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 25039 1726867454.68691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.68693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.68710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.68730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.68965: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25039 1726867454.69099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.69134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.69163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.69205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.69233: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.69332: variable 'ansible_python' from source: facts 25039 1726867454.69363: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25039 1726867454.69461: variable '__network_wpa_supplicant_required' from source: role '' defaults 25039 1726867454.69549: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25039 1726867454.69685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.69719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.69748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.69802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.69885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.69888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867454.69924: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867454.69957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.70012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867454.70034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867454.70191: variable 'network_connections' from source: task vars 25039 1726867454.70214: variable 'interface' from source: play vars 25039 1726867454.70324: variable 'interface' from source: play vars 25039 1726867454.70410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867454.70623: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867454.70683: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867454.70730: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867454.70865: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867454.70869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867454.70879: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867454.70918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867454.70954: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867454.71010: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867454.71293: variable 'network_connections' from source: task vars 25039 1726867454.71311: variable 'interface' from source: play vars 25039 1726867454.71387: variable 'interface' from source: play vars 25039 1726867454.71484: variable '__network_packages_default_wireless' from source: role '' defaults 25039 1726867454.71534: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867454.71833: variable 'network_connections' from source: task vars 25039 1726867454.71847: variable 'interface' from source: play vars 25039 1726867454.71922: variable 'interface' from source: play vars 25039 1726867454.71955: variable '__network_packages_default_team' from source: role '' defaults 25039 1726867454.72040: variable '__network_team_connections_defined' from source: role '' defaults 25039 1726867454.72385: variable 'network_connections' from source: task vars 25039 1726867454.72388: variable 'interface' from source: play vars 25039 1726867454.72428: variable 'interface' from source: play vars 25039 1726867454.72486: variable '__network_service_name_default_initscripts' from source: role '' defaults 25039 1726867454.72546: variable '__network_service_name_default_initscripts' from source: role '' defaults 25039 1726867454.72556: 
variable '__network_packages_default_initscripts' from source: role '' defaults 25039 1726867454.72618: variable '__network_packages_default_initscripts' from source: role '' defaults 25039 1726867454.72841: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25039 1726867454.73366: variable 'network_connections' from source: task vars 25039 1726867454.73370: variable 'interface' from source: play vars 25039 1726867454.73475: variable 'interface' from source: play vars 25039 1726867454.73482: variable 'ansible_distribution' from source: facts 25039 1726867454.73484: variable '__network_rh_distros' from source: role '' defaults 25039 1726867454.73486: variable 'ansible_distribution_major_version' from source: facts 25039 1726867454.73493: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25039 1726867454.73674: variable 'ansible_distribution' from source: facts 25039 1726867454.73686: variable '__network_rh_distros' from source: role '' defaults 25039 1726867454.73701: variable 'ansible_distribution_major_version' from source: facts 25039 1726867454.73722: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25039 1726867454.73913: variable 'ansible_distribution' from source: facts 25039 1726867454.73916: variable '__network_rh_distros' from source: role '' defaults 25039 1726867454.73918: variable 'ansible_distribution_major_version' from source: facts 25039 1726867454.73954: variable 'network_provider' from source: set_fact 25039 1726867454.74082: variable 'omit' from source: magic vars 25039 1726867454.74085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867454.74088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867454.74091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 
1726867454.74093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867454.74104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867454.74142: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867454.74150: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867454.74159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867454.74268: Set connection var ansible_shell_executable to /bin/sh 25039 1726867454.74282: Set connection var ansible_timeout to 10 25039 1726867454.74293: Set connection var ansible_shell_type to sh 25039 1726867454.74300: Set connection var ansible_connection to ssh 25039 1726867454.74319: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867454.74329: Set connection var ansible_pipelining to False 25039 1726867454.74360: variable 'ansible_shell_executable' from source: unknown 25039 1726867454.74424: variable 'ansible_connection' from source: unknown 25039 1726867454.74427: variable 'ansible_module_compression' from source: unknown 25039 1726867454.74430: variable 'ansible_shell_type' from source: unknown 25039 1726867454.74432: variable 'ansible_shell_executable' from source: unknown 25039 1726867454.74434: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867454.74437: variable 'ansible_pipelining' from source: unknown 25039 1726867454.74439: variable 'ansible_timeout' from source: unknown 25039 1726867454.74441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867454.74523: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867454.74544: variable 'omit' from source: magic vars 25039 1726867454.74560: starting attempt loop 25039 1726867454.74567: running the handler 25039 1726867454.74654: variable 'ansible_facts' from source: unknown 25039 1726867454.75484: _low_level_execute_command(): starting 25039 1726867454.75487: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867454.76020: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867454.76028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867454.76035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867454.76061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867454.76065: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867454.76067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867454.76122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867454.76125: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867454.76186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867454.77855: stdout chunk (state=3): >>>/root <<< 25039 1726867454.78067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867454.78070: stdout chunk (state=3): >>><<< 25039 1726867454.78073: stderr chunk (state=3): >>><<< 25039 1726867454.78095: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867454.78183: _low_level_execute_command(): starting 25039 1726867454.78186: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843 `" && echo 
ansible-tmp-1726867454.7810142-25601-19255922403843="` echo /root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843 `" ) && sleep 0' 25039 1726867454.78569: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867454.78573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867454.78587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867454.78649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867454.78653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867454.78702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867454.80574: stdout chunk (state=3): >>>ansible-tmp-1726867454.7810142-25601-19255922403843=/root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843 <<< 25039 1726867454.80882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867454.80887: stdout chunk (state=3): >>><<< 25039 1726867454.80890: stderr chunk (state=3): >>><<< 25039 
1726867454.80893: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867454.7810142-25601-19255922403843=/root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867454.80896: variable 'ansible_module_compression' from source: unknown 25039 1726867454.80900: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 25039 1726867454.80903: ANSIBALLZ: Acquiring lock 25039 1726867454.80906: ANSIBALLZ: Lock acquired: 140682442827552 25039 1726867454.80911: ANSIBALLZ: Creating module 25039 1726867455.02033: ANSIBALLZ: Writing module into payload 25039 1726867455.02140: ANSIBALLZ: Writing module 25039 1726867455.02163: ANSIBALLZ: Renaming module 25039 1726867455.02169: ANSIBALLZ: Done creating module 25039 1726867455.02202: variable 'ansible_facts' from source: unknown 25039 
1726867455.02339: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843/AnsiballZ_systemd.py 25039 1726867455.02445: Sending initial data 25039 1726867455.02449: Sent initial data (155 bytes) 25039 1726867455.02914: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867455.02922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867455.02925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867455.02928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867455.02931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867455.02989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867455.02992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867455.02995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867455.03060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867455.04713: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" 
revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867455.04773: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867455.04830: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp7xb6ymi2 /root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843/AnsiballZ_systemd.py <<< 25039 1726867455.04833: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843/AnsiballZ_systemd.py" <<< 25039 1726867455.04887: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp7xb6ymi2" to remote "/root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843/AnsiballZ_systemd.py" <<< 25039 1726867455.06121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867455.06183: stderr chunk (state=3): >>><<< 25039 1726867455.06186: stdout chunk (state=3): >>><<< 25039 1726867455.06188: done transferring module to remote 25039 1726867455.06190: _low_level_execute_command(): starting 25039 1726867455.06193: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843/ /root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843/AnsiballZ_systemd.py && sleep 0' 25039 1726867455.06581: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867455.06617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867455.06620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867455.06622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867455.06624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867455.06626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867455.06630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867455.06668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867455.06673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867455.06742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867455.08562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867455.08565: stdout chunk 
(state=3): >>><<< 25039 1726867455.08567: stderr chunk (state=3): >>><<< 25039 1726867455.08589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867455.08665: _low_level_execute_command(): starting 25039 1726867455.08669: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843/AnsiballZ_systemd.py && sleep 0' 25039 1726867455.09185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867455.09199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867455.09222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867455.09322: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867455.09350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867455.09364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867455.09450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867455.38115: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", 
"StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10866688", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3296878592", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1550364000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", 
"IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 25039 1726867455.38158: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", 
"LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": 
"private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-b<<< 25039 1726867455.38169: stdout chunk (state=3): >>>roker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", 
"UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 25039 1726867455.39940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867455.39970: stderr chunk (state=3): >>><<< 25039 1726867455.39973: stdout chunk (state=3): >>><<< 25039 1726867455.39992: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10866688", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3296878592", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1550364000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867455.40114: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867455.40130: _low_level_execute_command(): starting 25039 1726867455.40134: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867454.7810142-25601-19255922403843/ > /dev/null 2>&1 && sleep 0' 25039 1726867455.40570: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867455.40574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867455.40612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867455.40615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867455.40618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867455.40620: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867455.40622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867455.40673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867455.40681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867455.40684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867455.40724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867455.42545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867455.42549: stderr chunk (state=3): >>><<< 25039 1726867455.42551: stdout chunk (state=3): >>><<< 25039 1726867455.42570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867455.42573: handler run complete 25039 1726867455.42616: attempt loop complete, returning result 25039 1726867455.42620: _execute() done 25039 1726867455.42622: dumping result to json 25039 1726867455.42634: done dumping result, returning 25039 1726867455.42642: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-3ddc-7272-000000000023] 25039 1726867455.42646: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000023 25039 1726867455.42875: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000023 25039 1726867455.42880: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25039 1726867455.42931: no more pending results, returning what we have 25039 1726867455.42934: results queue empty 25039 1726867455.42935: checking for any_errors_fatal 25039 1726867455.42940: done checking for any_errors_fatal 25039 1726867455.42941: checking for max_fail_percentage 25039 1726867455.42943: done checking for max_fail_percentage 25039 1726867455.42943: checking to see if all hosts have failed and the running result is not ok 25039 1726867455.42944: done checking to see if all hosts have failed 25039 1726867455.42945: getting the remaining hosts for this loop 25039 1726867455.42947: done getting the remaining hosts for this loop 25039 1726867455.42950: getting the next task for host managed_node1 25039 1726867455.42955: done getting next task for host managed_node1 25039 1726867455.42959: ^ task is: TASK: fedora.linux_system_roles.network : Enable 
and start wpa_supplicant 25039 1726867455.42961: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867455.42970: getting variables 25039 1726867455.42971: in VariableManager get_vars() 25039 1726867455.43015: Calling all_inventory to load vars for managed_node1 25039 1726867455.43018: Calling groups_inventory to load vars for managed_node1 25039 1726867455.43020: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867455.43030: Calling all_plugins_play to load vars for managed_node1 25039 1726867455.43032: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867455.43035: Calling groups_plugins_play to load vars for managed_node1 25039 1726867455.43897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867455.45251: done with get_vars() 25039 1726867455.45269: done getting variables 25039 1726867455.45319: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024 17:24:15 -0400 (0:00:00.818) 0:00:12.979 ******

25039 1726867455.45342: entering _queue_task() for managed_node1/service 25039 1726867455.45557: worker is 1 (out of 1 available) 25039 1726867455.45569: exiting _queue_task() for managed_node1/service 25039 1726867455.45583: done queuing things up, now waiting for results queue to drain 25039 1726867455.45753: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25039 1726867455.45835: in run() - task 0affcac9-a3a5-3ddc-7272-000000000024 25039 1726867455.45846: variable 'ansible_search_path' from source: unknown 25039 1726867455.45850: variable 'ansible_search_path' from source: unknown 25039 1726867455.45879: calling self._execute() 25039 1726867455.45949: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867455.45953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867455.45962: variable 'omit' from source: magic vars 25039 1726867455.46227: variable 'ansible_distribution_major_version' from source: facts 25039 1726867455.46237: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867455.46320: variable 'network_provider' from source: set_fact 25039 1726867455.46324: Evaluated conditional (network_provider == "nm"): True 25039 1726867455.46390: variable '__network_wpa_supplicant_required' from source: role '' defaults 25039 1726867455.46452: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25039 1726867455.46569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867455.48482: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867455.48486: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039
1726867455.48499: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867455.48542: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867455.48573: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867455.48664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867455.48699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867455.48733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867455.48776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867455.48799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867455.48849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867455.48880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867455.48914: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867455.48956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867455.48974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867455.49029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867455.49282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867455.49286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867455.49288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867455.49290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867455.49292: variable 'network_connections' from source: task vars 25039 1726867455.49294: variable 'interface' from source: play vars 25039 1726867455.49364: variable 'interface' from source: play vars 25039 
1726867455.49445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867455.49611: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867455.49651: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867455.49689: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867455.49723: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867455.49769: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867455.49798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867455.49829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867455.49856: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867455.49905: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867455.50151: variable 'network_connections' from source: task vars 25039 1726867455.50162: variable 'interface' from source: play vars 25039 1726867455.50228: variable 'interface' from source: play vars 25039 1726867455.50272: Evaluated conditional (__network_wpa_supplicant_required): False 25039 1726867455.50281: when evaluation is False, skipping this task 25039 1726867455.50289: _execute() 
done 25039 1726867455.50297: dumping result to json 25039 1726867455.50307: done dumping result, returning 25039 1726867455.50318: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-3ddc-7272-000000000024] 25039 1726867455.50340: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000024 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 25039 1726867455.50490: no more pending results, returning what we have 25039 1726867455.50493: results queue empty 25039 1726867455.50494: checking for any_errors_fatal 25039 1726867455.50518: done checking for any_errors_fatal 25039 1726867455.50519: checking for max_fail_percentage 25039 1726867455.50520: done checking for max_fail_percentage 25039 1726867455.50521: checking to see if all hosts have failed and the running result is not ok 25039 1726867455.50522: done checking to see if all hosts have failed 25039 1726867455.50522: getting the remaining hosts for this loop 25039 1726867455.50524: done getting the remaining hosts for this loop 25039 1726867455.50527: getting the next task for host managed_node1 25039 1726867455.50533: done getting next task for host managed_node1 25039 1726867455.50537: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 25039 1726867455.50539: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 25039 1726867455.50552: getting variables 25039 1726867455.50554: in VariableManager get_vars() 25039 1726867455.50592: Calling all_inventory to load vars for managed_node1 25039 1726867455.50595: Calling groups_inventory to load vars for managed_node1 25039 1726867455.50597: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867455.50607: Calling all_plugins_play to load vars for managed_node1 25039 1726867455.50610: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867455.50612: Calling groups_plugins_play to load vars for managed_node1 25039 1726867455.51293: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000024 25039 1726867455.51297: WORKER PROCESS EXITING 25039 1726867455.52221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867455.54117: done with get_vars() 25039 1726867455.54253: done getting variables 25039 1726867455.54454: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 17:24:15 -0400 (0:00:00.066) 0:00:13.070 ******

25039 1726867455.54487: entering _queue_task() for managed_node1/service 25039 1726867455.55051: worker is 1 (out of 1 available) 25039 1726867455.55063: exiting _queue_task() for managed_node1/service 25039 1726867455.55076: done queuing things up, now waiting for results queue to drain 25039 1726867455.55079: waiting for pending results...
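The wpa_supplicant skip recorded above is ordinary `when:` evaluation: `network_provider == "nm"` held, but `__network_wpa_supplicant_required` evaluated False, so the task never ran. A hedged sketch of that guard pattern; the task body is hypothetical, only the conditional names come from this log:

```yaml
# Hypothetical task body; only the 'when' conditions are taken from the log.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required
```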
25039 1726867455.55470: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 25039 1726867455.55564: in run() - task 0affcac9-a3a5-3ddc-7272-000000000025 25039 1726867455.55568: variable 'ansible_search_path' from source: unknown 25039 1726867455.55583: variable 'ansible_search_path' from source: unknown 25039 1726867455.55624: calling self._execute() 25039 1726867455.55770: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867455.55774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867455.55778: variable 'omit' from source: magic vars 25039 1726867455.56145: variable 'ansible_distribution_major_version' from source: facts 25039 1726867455.56162: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867455.56283: variable 'network_provider' from source: set_fact 25039 1726867455.56294: Evaluated conditional (network_provider == "initscripts"): False 25039 1726867455.56301: when evaluation is False, skipping this task 25039 1726867455.56313: _execute() done 25039 1726867455.56320: dumping result to json 25039 1726867455.56382: done dumping result, returning 25039 1726867455.56386: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-3ddc-7272-000000000025] 25039 1726867455.56389: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000025 25039 1726867455.56559: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000025 25039 1726867455.56562: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25039 1726867455.56611: no more pending results, returning what we have 25039 1726867455.56616: results queue empty 25039 1726867455.56617: checking for any_errors_fatal 25039 1726867455.56626: done checking for 
any_errors_fatal 25039 1726867455.56627: checking for max_fail_percentage 25039 1726867455.56743: done checking for max_fail_percentage 25039 1726867455.56745: checking to see if all hosts have failed and the running result is not ok 25039 1726867455.56746: done checking to see if all hosts have failed 25039 1726867455.56747: getting the remaining hosts for this loop 25039 1726867455.56748: done getting the remaining hosts for this loop 25039 1726867455.56751: getting the next task for host managed_node1 25039 1726867455.56757: done getting next task for host managed_node1 25039 1726867455.56761: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25039 1726867455.56763: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867455.56776: getting variables 25039 1726867455.56780: in VariableManager get_vars() 25039 1726867455.56815: Calling all_inventory to load vars for managed_node1 25039 1726867455.56818: Calling groups_inventory to load vars for managed_node1 25039 1726867455.56821: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867455.56832: Calling all_plugins_play to load vars for managed_node1 25039 1726867455.56835: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867455.56838: Calling groups_plugins_play to load vars for managed_node1 25039 1726867455.58248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867455.60899: done with get_vars() 25039 1726867455.60924: done getting variables 25039 1726867455.61107: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 17:24:15 -0400 (0:00:00.087) 0:00:13.137 ******

25039 1726867455.61143: entering _queue_task() for managed_node1/copy 25039 1726867455.61946: worker is 1 (out of 1 available) 25039 1726867455.61956: exiting _queue_task() for managed_node1/copy 25039 1726867455.61968: done queuing things up, now waiting for results queue to drain 25039 1726867455.61969: waiting for pending results...
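Both initscripts-specific tasks in this stretch of the log ("Enable network service" and "Ensure initscripts network file dependency is present") are skipped because this run resolved `network_provider` to `nm`. A minimal sketch of pinning the provider from a playbook, assuming a hypothetical play layout; only the host name, role name, and variable value mirror this run:

```yaml
# Hypothetical play; the layout is illustrative, not taken from the log.
- hosts: managed_node1
  vars:
    network_provider: nm
  roles:
    - fedora.linux_system_roles.network
```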
25039 1726867455.62498: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25039 1726867455.62583: in run() - task 0affcac9-a3a5-3ddc-7272-000000000026 25039 1726867455.62588: variable 'ansible_search_path' from source: unknown 25039 1726867455.62590: variable 'ansible_search_path' from source: unknown 25039 1726867455.62982: calling self._execute() 25039 1726867455.62986: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867455.62989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867455.62991: variable 'omit' from source: magic vars 25039 1726867455.63783: variable 'ansible_distribution_major_version' from source: facts 25039 1726867455.63786: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867455.63789: variable 'network_provider' from source: set_fact 25039 1726867455.63791: Evaluated conditional (network_provider == "initscripts"): False 25039 1726867455.63794: when evaluation is False, skipping this task 25039 1726867455.63796: _execute() done 25039 1726867455.64184: dumping result to json 25039 1726867455.64187: done dumping result, returning 25039 1726867455.64191: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-3ddc-7272-000000000026] 25039 1726867455.64193: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000026 25039 1726867455.64267: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000026 25039 1726867455.64271: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 25039 1726867455.64320: no more pending results, returning what we have 25039 1726867455.64324: results queue empty 25039 1726867455.64325: checking for 
any_errors_fatal 25039 1726867455.64332: done checking for any_errors_fatal 25039 1726867455.64333: checking for max_fail_percentage 25039 1726867455.64335: done checking for max_fail_percentage 25039 1726867455.64336: checking to see if all hosts have failed and the running result is not ok 25039 1726867455.64337: done checking to see if all hosts have failed 25039 1726867455.64338: getting the remaining hosts for this loop 25039 1726867455.64339: done getting the remaining hosts for this loop 25039 1726867455.64342: getting the next task for host managed_node1 25039 1726867455.64348: done getting next task for host managed_node1 25039 1726867455.64353: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25039 1726867455.64355: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867455.64372: getting variables 25039 1726867455.64374: in VariableManager get_vars() 25039 1726867455.64417: Calling all_inventory to load vars for managed_node1 25039 1726867455.64420: Calling groups_inventory to load vars for managed_node1 25039 1726867455.64422: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867455.64435: Calling all_plugins_play to load vars for managed_node1 25039 1726867455.64438: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867455.64441: Calling groups_plugins_play to load vars for managed_node1 25039 1726867455.67176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867455.69791: done with get_vars() 25039 1726867455.69822: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 17:24:15 -0400 (0:00:00.087) 0:00:13.224 ******

25039 1726867455.69923: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 25039 1726867455.69925: Creating lock for fedora.linux_system_roles.network_connections 25039 1726867455.70317: worker is 1 (out of 1 available) 25039 1726867455.70331: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 25039 1726867455.70345: done queuing things up, now waiting for results queue to drain 25039 1726867455.70346: waiting for pending results...
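The "Configure networking connection profiles" task queued here consumes the `network_connections` list, which the log shows being resolved from the play-level `interface` variable. A hedged sketch of what such input can look like; the `type` and `state` values are assumptions, since the actual connection spec is not shown in this log:

```yaml
# Hypothetical connection spec; only the use of '{{ interface }}' inside
# network_connections is taken from the log.
network_connections:
  - name: "{{ interface }}"
    type: ethernet
    state: up
```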
25039 1726867455.70636: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25039 1726867455.70783: in run() - task 0affcac9-a3a5-3ddc-7272-000000000027 25039 1726867455.70815: variable 'ansible_search_path' from source: unknown 25039 1726867455.70823: variable 'ansible_search_path' from source: unknown 25039 1726867455.70866: calling self._execute() 25039 1726867455.70973: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867455.70989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867455.71004: variable 'omit' from source: magic vars 25039 1726867455.71397: variable 'ansible_distribution_major_version' from source: facts 25039 1726867455.71415: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867455.71430: variable 'omit' from source: magic vars 25039 1726867455.71498: variable 'omit' from source: magic vars 25039 1726867455.71671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867455.75552: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867455.75639: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867455.75681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867455.75722: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867455.75757: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867455.75844: variable 'network_provider' from source: set_fact 25039 1726867455.75985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867455.76031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867455.76288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867455.76294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867455.76296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867455.76299: variable 'omit' from source: magic vars 25039 1726867455.76320: variable 'omit' from source: magic vars 25039 1726867455.76425: variable 'network_connections' from source: task vars 25039 1726867455.76442: variable 'interface' from source: play vars 25039 1726867455.76512: variable 'interface' from source: play vars 25039 1726867455.76681: variable 'omit' from source: magic vars 25039 1726867455.76751: variable '__lsr_ansible_managed' from source: task vars 25039 1726867455.76959: variable '__lsr_ansible_managed' from source: task vars 25039 1726867455.77423: Loaded config def from plugin (lookup/template) 25039 1726867455.77493: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 25039 1726867455.77529: File lookup term: get_ansible_managed.j2 25039 1726867455.77537: variable 'ansible_search_path' from source: unknown 25039 1726867455.77546: evaluation_path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 25039 1726867455.77561: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 25039 1726867455.77631: variable 'ansible_search_path' from source: unknown 25039 1726867455.84420: variable 'ansible_managed' from source: unknown 25039 1726867455.84557: variable 'omit' from source: magic vars 25039 1726867455.84593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867455.84624: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867455.84767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867455.84792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867455.84808: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867455.84840: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867455.84849: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867455.84858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867455.84956: Set connection var ansible_shell_executable to /bin/sh 25039 1726867455.84968: Set connection var ansible_timeout to 10 25039 1726867455.84984: Set connection var ansible_shell_type to sh 25039 1726867455.84992: Set connection var ansible_connection to ssh 25039 1726867455.85005: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867455.85016: Set connection var ansible_pipelining to False 25039 1726867455.85046: variable 'ansible_shell_executable' from source: unknown 25039 1726867455.85055: variable 'ansible_connection' from source: unknown 25039 1726867455.85062: variable 'ansible_module_compression' from source: unknown 25039 1726867455.85068: variable 'ansible_shell_type' from source: unknown 25039 1726867455.85073: variable 'ansible_shell_executable' from source: unknown 25039 1726867455.85084: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867455.85282: variable 'ansible_pipelining' from source: unknown 25039 1726867455.85285: variable 'ansible_timeout' from source: unknown 25039 1726867455.85287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867455.85290: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25039 1726867455.85299: variable 'omit' from source: magic vars 25039 1726867455.85301: starting attempt loop 25039 
1726867455.85304: running the handler 25039 1726867455.85306: _low_level_execute_command(): starting 25039 1726867455.85308: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867455.85979: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867455.86018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867455.86035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867455.86057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867455.86198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867455.87846: stdout chunk (state=3): >>>/root <<< 25039 1726867455.87971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867455.87989: stdout chunk (state=3): >>><<< 25039 1726867455.88010: stderr chunk (state=3): >>><<< 25039 1726867455.88034: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867455.88052: _low_level_execute_command(): starting 25039 1726867455.88131: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747 `" && echo ansible-tmp-1726867455.8804097-25641-278759463299747="` echo /root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747 `" ) && sleep 0' 25039 1726867455.88691: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867455.88704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867455.88719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867455.88735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
25039 1726867455.88750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867455.88798: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867455.88873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867455.88893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867455.88994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867455.89063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867455.90934: stdout chunk (state=3): >>>ansible-tmp-1726867455.8804097-25641-278759463299747=/root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747 <<< 25039 1726867455.91046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867455.91166: stderr chunk (state=3): >>><<< 25039 1726867455.91200: stdout chunk (state=3): >>><<< 25039 1726867455.91219: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867455.8804097-25641-278759463299747=/root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867455.91262: variable 'ansible_module_compression' from source: unknown 25039 1726867455.91321: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 25039 1726867455.91325: ANSIBALLZ: Acquiring lock 25039 1726867455.91327: ANSIBALLZ: Lock acquired: 140682436655872 25039 1726867455.91330: ANSIBALLZ: Creating module 25039 1726867456.14746: ANSIBALLZ: Writing module into payload 25039 1726867456.15058: ANSIBALLZ: Writing module 25039 1726867456.15081: ANSIBALLZ: Renaming module 25039 1726867456.15087: ANSIBALLZ: Done creating module 25039 1726867456.15112: variable 'ansible_facts' from source: unknown 25039 1726867456.15222: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747/AnsiballZ_network_connections.py 25039 1726867456.15372: Sending initial data 25039 1726867456.15375: Sent initial data (168 bytes) 25039 1726867456.16092: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867456.16123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867456.16158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867456.16288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867456.17867: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25039 1726867456.17879: stderr chunk (state=3): >>>debug2: Unrecognised 
server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867456.17909: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867456.17954: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpmdf81lnu /root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747/AnsiballZ_network_connections.py <<< 25039 1726867456.17957: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747/AnsiballZ_network_connections.py" <<< 25039 1726867456.18003: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpmdf81lnu" to remote "/root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747/AnsiballZ_network_connections.py" <<< 25039 1726867456.18993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867456.19054: stderr chunk (state=3): >>><<< 25039 1726867456.19058: stdout chunk (state=3): >>><<< 25039 1726867456.19108: done transferring module to remote 25039 1726867456.19117: _low_level_execute_command(): starting 25039 1726867456.19122: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747/ /root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747/AnsiballZ_network_connections.py && sleep 0' 25039 1726867456.19541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867456.19544: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25039 1726867456.19547: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867456.19549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867456.19595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867456.19598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867456.19650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867456.21469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867456.21472: stdout chunk (state=3): >>><<< 25039 1726867456.21474: stderr chunk (state=3): >>><<< 25039 1726867456.21495: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867456.21531: _low_level_execute_command(): starting 25039 1726867456.21572: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747/AnsiballZ_network_connections.py && sleep 0' 25039 1726867456.22116: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867456.22119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867456.22186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867456.22190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867456.22192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867456.22195: stderr chunk (state=3): >>>debug2: match not found <<< 25039 1726867456.22197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867456.22199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25039 1726867456.22212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 25039 1726867456.22222: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 25039 1726867456.22225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867456.22234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867456.22242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867456.22249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867456.22256: stderr chunk (state=3): >>>debug2: match found <<< 25039 1726867456.22278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867456.22373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867456.22379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867456.22457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867458.27309: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 05d68ca3-7a29-47b4-8db1-5de4d05c6555\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 05d68ca3-7a29-47b4-8db1-5de4d05c6555 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, 
"auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 25039 1726867458.29205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 25039 1726867458.29302: stderr chunk (state=3): >>><<< 25039 1726867458.29306: stdout chunk (state=3): >>><<< 25039 1726867458.29329: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 05d68ca3-7a29-47b4-8db1-5de4d05c6555\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 05d68ca3-7a29-47b4-8db1-5de4d05c6555 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 25039 1726867458.29376: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/32', '2001:db8::3/32', '2001:db8::4/32'], 'gateway6': '2001:db8::1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867458.29527: 
_low_level_execute_command(): starting 25039 1726867458.29530: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867455.8804097-25641-278759463299747/ > /dev/null 2>&1 && sleep 0' 25039 1726867458.30858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867458.30879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867458.30970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867458.30995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867458.31138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867458.33285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867458.33290: stdout chunk (state=3): >>><<< 25039 1726867458.33292: stderr chunk (state=3): >>><<< 25039 1726867458.33295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867458.33297: handler run complete 25039 1726867458.33304: attempt loop complete, returning result 25039 1726867458.33307: _execute() done 25039 1726867458.33312: dumping result to json 25039 1726867458.33314: done dumping result, returning 25039 1726867458.33316: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-3ddc-7272-000000000027] 25039 1726867458.33318: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000027 changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], 
"force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 05d68ca3-7a29-47b4-8db1-5de4d05c6555 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 05d68ca3-7a29-47b4-8db1-5de4d05c6555 (not-active) 25039 1726867458.33807: no more pending results, returning what we have 25039 1726867458.33815: results queue empty 25039 1726867458.33816: checking for any_errors_fatal 25039 1726867458.33824: done checking for any_errors_fatal 25039 1726867458.33825: checking for max_fail_percentage 25039 1726867458.33827: done checking for max_fail_percentage 25039 1726867458.33828: checking to see if all hosts have failed and the running result is not ok 25039 1726867458.33829: done checking to see if all hosts have failed 25039 1726867458.33830: getting the remaining hosts for this loop 25039 1726867458.33831: done getting the remaining hosts for this loop 25039 1726867458.33835: getting the next task for host managed_node1 25039 1726867458.33843: done getting next task for host managed_node1 25039 1726867458.33847: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 25039 1726867458.33850: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867458.33862: getting variables 25039 1726867458.33864: in VariableManager get_vars() 25039 1726867458.34213: Calling all_inventory to load vars for managed_node1 25039 1726867458.34216: Calling groups_inventory to load vars for managed_node1 25039 1726867458.34220: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867458.34233: Calling all_plugins_play to load vars for managed_node1 25039 1726867458.34237: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867458.34240: Calling groups_plugins_play to load vars for managed_node1 25039 1726867458.34986: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000027 25039 1726867458.34990: WORKER PROCESS EXITING 25039 1726867458.37159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867458.39080: done with get_vars() 25039 1726867458.39103: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:24:18 -0400 (0:00:02.692) 0:00:15.917 ****** 25039 1726867458.39189: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 25039 1726867458.39191: Creating lock for fedora.linux_system_roles.network_state 25039 1726867458.39524: worker is 1 (out of 1 available) 25039 1726867458.39538: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 25039 1726867458.39551: done queuing things up, now waiting for results queue to drain 25039 1726867458.39552: waiting for pending results... 
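The `module_args` echoed in the task result above are the rendered input of the role's connection-profile module. As a non-authoritative sketch, the play variables that would produce that invocation look roughly like the following; the connection values (`veth0`, the `2001:db8::` addresses, `gateway6`) are copied from the logged result, while the play layout and the `network_provider` setting are assumptions based on the role's documented variable names:

```yaml
# Sketch only: role input matching the logged module_args.
# Connection values are taken from the task result in the log;
# play structure and network_provider are assumed.
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm          # log shows "provider": "nm"
        network_connections:
          - name: veth0
            type: ethernet
            state: up
            ip:
              dhcp4: false
              auto6: false
              address:
                - 2001:db8::2/32
                - 2001:db8::3/32
                - 2001:db8::4/32
              gateway6: 2001:db8::1
```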
25039 1726867458.39810: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 25039 1726867458.39986: in run() - task 0affcac9-a3a5-3ddc-7272-000000000028 25039 1726867458.40019: variable 'ansible_search_path' from source: unknown 25039 1726867458.40028: variable 'ansible_search_path' from source: unknown 25039 1726867458.40093: calling self._execute() 25039 1726867458.40588: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867458.40593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867458.40597: variable 'omit' from source: magic vars 25039 1726867458.41030: variable 'ansible_distribution_major_version' from source: facts 25039 1726867458.41159: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867458.41421: variable 'network_state' from source: role '' defaults 25039 1726867458.41483: Evaluated conditional (network_state != {}): False 25039 1726867458.41499: when evaluation is False, skipping this task 25039 1726867458.41510: _execute() done 25039 1726867458.41519: dumping result to json 25039 1726867458.41526: done dumping result, returning 25039 1726867458.41552: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-3ddc-7272-000000000028] 25039 1726867458.41589: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000028 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25039 1726867458.41778: no more pending results, returning what we have 25039 1726867458.41783: results queue empty 25039 1726867458.41784: checking for any_errors_fatal 25039 1726867458.41798: done checking for any_errors_fatal 25039 1726867458.41799: checking for max_fail_percentage 25039 1726867458.41801: done checking for max_fail_percentage 25039 1726867458.41802: 
checking to see if all hosts have failed and the running result is not ok 25039 1726867458.41803: done checking to see if all hosts have failed 25039 1726867458.41804: getting the remaining hosts for this loop 25039 1726867458.41805: done getting the remaining hosts for this loop 25039 1726867458.41811: getting the next task for host managed_node1 25039 1726867458.41819: done getting next task for host managed_node1 25039 1726867458.41831: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25039 1726867458.41836: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867458.41850: getting variables 25039 1726867458.41852: in VariableManager get_vars() 25039 1726867458.41894: Calling all_inventory to load vars for managed_node1 25039 1726867458.41897: Calling groups_inventory to load vars for managed_node1 25039 1726867458.41900: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867458.41920: Calling all_plugins_play to load vars for managed_node1 25039 1726867458.41923: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867458.41926: Calling groups_plugins_play to load vars for managed_node1 25039 1726867458.42771: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000028 25039 1726867458.42775: WORKER PROCESS EXITING 25039 1726867458.43857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867458.45675: done with get_vars() 25039 1726867458.45697: done getting variables 25039 1726867458.45762: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:24:18 -0400 (0:00:00.066) 0:00:15.983 ****** 25039 1726867458.45824: entering _queue_task() for managed_node1/debug 25039 1726867458.46166: worker is 1 (out of 1 available) 25039 1726867458.46384: exiting _queue_task() for managed_node1/debug 25039 1726867458.46393: done queuing things up, now waiting for results queue to drain 25039 1726867458.46395: waiting for pending results... 
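The "Configure networking state" task above was skipped because `network_state` kept its role default of `{}` (the log shows `Evaluated conditional (network_state != {}): False`). Purely as a hypothetical illustration, a non-empty `network_state` (which follows the nmstate declarative schema rather than `network_connections`) would look something like:

```yaml
# Hypothetical example only: this run left network_state at its
# default {}, so the task was skipped. A non-empty value in the
# nmstate schema might look like:
network_state:
  interfaces:
    - name: veth0
      type: ethernet
      state: up
```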
25039 1726867458.46475: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25039 1726867458.46620: in run() - task 0affcac9-a3a5-3ddc-7272-000000000029 25039 1726867458.46739: variable 'ansible_search_path' from source: unknown 25039 1726867458.46742: variable 'ansible_search_path' from source: unknown 25039 1726867458.46746: calling self._execute() 25039 1726867458.46786: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867458.46800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867458.46817: variable 'omit' from source: magic vars 25039 1726867458.47199: variable 'ansible_distribution_major_version' from source: facts 25039 1726867458.47218: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867458.47229: variable 'omit' from source: magic vars 25039 1726867458.47294: variable 'omit' from source: magic vars 25039 1726867458.47336: variable 'omit' from source: magic vars 25039 1726867458.47405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867458.47462: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867458.47517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867458.47621: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867458.47625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867458.47627: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867458.47630: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867458.47632: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 25039 1726867458.48087: Set connection var ansible_shell_executable to /bin/sh 25039 1726867458.48090: Set connection var ansible_timeout to 10 25039 1726867458.48093: Set connection var ansible_shell_type to sh 25039 1726867458.48095: Set connection var ansible_connection to ssh 25039 1726867458.48098: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867458.48100: Set connection var ansible_pipelining to False 25039 1726867458.48102: variable 'ansible_shell_executable' from source: unknown 25039 1726867458.48104: variable 'ansible_connection' from source: unknown 25039 1726867458.48107: variable 'ansible_module_compression' from source: unknown 25039 1726867458.48112: variable 'ansible_shell_type' from source: unknown 25039 1726867458.48114: variable 'ansible_shell_executable' from source: unknown 25039 1726867458.48116: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867458.48118: variable 'ansible_pipelining' from source: unknown 25039 1726867458.48122: variable 'ansible_timeout' from source: unknown 25039 1726867458.48132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867458.48397: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867458.48425: variable 'omit' from source: magic vars 25039 1726867458.48490: starting attempt loop 25039 1726867458.48497: running the handler 25039 1726867458.48682: variable '__network_connections_result' from source: set_fact 25039 1726867458.48985: handler run complete 25039 1726867458.48989: attempt loop complete, returning result 25039 1726867458.48992: _execute() done 25039 1726867458.48994: dumping result to json 25039 1726867458.48996: 
done dumping result, returning 25039 1726867458.48999: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-3ddc-7272-000000000029] 25039 1726867458.49001: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000029 25039 1726867458.49061: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000029 25039 1726867458.49064: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 05d68ca3-7a29-47b4-8db1-5de4d05c6555", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 05d68ca3-7a29-47b4-8db1-5de4d05c6555 (not-active)" ] } 25039 1726867458.49134: no more pending results, returning what we have 25039 1726867458.49138: results queue empty 25039 1726867458.49139: checking for any_errors_fatal 25039 1726867458.49145: done checking for any_errors_fatal 25039 1726867458.49145: checking for max_fail_percentage 25039 1726867458.49147: done checking for max_fail_percentage 25039 1726867458.49148: checking to see if all hosts have failed and the running result is not ok 25039 1726867458.49149: done checking to see if all hosts have failed 25039 1726867458.49150: getting the remaining hosts for this loop 25039 1726867458.49151: done getting the remaining hosts for this loop 25039 1726867458.49155: getting the next task for host managed_node1 25039 1726867458.49162: done getting next task for host managed_node1 25039 1726867458.49166: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25039 1726867458.49169: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867458.49181: getting variables 25039 1726867458.49184: in VariableManager get_vars() 25039 1726867458.49222: Calling all_inventory to load vars for managed_node1 25039 1726867458.49226: Calling groups_inventory to load vars for managed_node1 25039 1726867458.49228: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867458.49239: Calling all_plugins_play to load vars for managed_node1 25039 1726867458.49242: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867458.49245: Calling groups_plugins_play to load vars for managed_node1 25039 1726867458.51942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867458.56701: done with get_vars() 25039 1726867458.56728: done getting variables 25039 1726867458.56790: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:24:18 -0400 (0:00:00.109) 0:00:16.093 ****** 25039 1726867458.56830: entering _queue_task() for managed_node1/debug 25039 1726867458.57381: worker is 1 (out of 1 available) 25039 1726867458.57389: exiting _queue_task() for 
managed_node1/debug 25039 1726867458.57398: done queuing things up, now waiting for results queue to drain 25039 1726867458.57399: waiting for pending results... 25039 1726867458.57463: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25039 1726867458.57628: in run() - task 0affcac9-a3a5-3ddc-7272-00000000002a 25039 1726867458.57633: variable 'ansible_search_path' from source: unknown 25039 1726867458.57635: variable 'ansible_search_path' from source: unknown 25039 1726867458.57673: calling self._execute() 25039 1726867458.57782: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867458.57796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867458.57814: variable 'omit' from source: magic vars 25039 1726867458.58283: variable 'ansible_distribution_major_version' from source: facts 25039 1726867458.58287: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867458.58289: variable 'omit' from source: magic vars 25039 1726867458.58329: variable 'omit' from source: magic vars 25039 1726867458.58369: variable 'omit' from source: magic vars 25039 1726867458.58615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867458.58619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867458.58621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867458.58629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867458.58631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867458.58726: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 25039 1726867458.58735: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867458.58742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867458.58854: Set connection var ansible_shell_executable to /bin/sh 25039 1726867458.58866: Set connection var ansible_timeout to 10 25039 1726867458.58875: Set connection var ansible_shell_type to sh 25039 1726867458.58885: Set connection var ansible_connection to ssh 25039 1726867458.58897: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867458.58929: Set connection var ansible_pipelining to False 25039 1726867458.58939: variable 'ansible_shell_executable' from source: unknown 25039 1726867458.58946: variable 'ansible_connection' from source: unknown 25039 1726867458.58952: variable 'ansible_module_compression' from source: unknown 25039 1726867458.58959: variable 'ansible_shell_type' from source: unknown 25039 1726867458.59039: variable 'ansible_shell_executable' from source: unknown 25039 1726867458.59042: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867458.59044: variable 'ansible_pipelining' from source: unknown 25039 1726867458.59046: variable 'ansible_timeout' from source: unknown 25039 1726867458.59048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867458.59367: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867458.59475: variable 'omit' from source: magic vars 25039 1726867458.59480: starting attempt loop 25039 1726867458.59483: running the handler 25039 1726867458.59485: variable '__network_connections_result' from source: set_fact 25039 1726867458.59569: variable 
'__network_connections_result' from source: set_fact 25039 1726867458.59759: handler run complete 25039 1726867458.59930: attempt loop complete, returning result 25039 1726867458.59976: _execute() done 25039 1726867458.59987: dumping result to json 25039 1726867458.59996: done dumping result, returning 25039 1726867458.60009: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-3ddc-7272-00000000002a] 25039 1726867458.60020: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000002a 25039 1726867458.60209: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000002a 25039 1726867458.60213: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 05d68ca3-7a29-47b4-8db1-5de4d05c6555\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 05d68ca3-7a29-47b4-8db1-5de4d05c6555 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 05d68ca3-7a29-47b4-8db1-5de4d05c6555", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 05d68ca3-7a29-47b4-8db1-5de4d05c6555 (not-active)" ] } } 25039 1726867458.60339: no more pending results, returning what we have 25039 1726867458.60343: results queue empty 25039 1726867458.60345: checking for any_errors_fatal 25039 1726867458.60464: done checking for any_errors_fatal 25039 
1726867458.60466: checking for max_fail_percentage 25039 1726867458.60467: done checking for max_fail_percentage 25039 1726867458.60468: checking to see if all hosts have failed and the running result is not ok 25039 1726867458.60469: done checking to see if all hosts have failed 25039 1726867458.60470: getting the remaining hosts for this loop 25039 1726867458.60472: done getting the remaining hosts for this loop 25039 1726867458.60475: getting the next task for host managed_node1 25039 1726867458.60483: done getting next task for host managed_node1 25039 1726867458.60486: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25039 1726867458.60489: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867458.60501: getting variables 25039 1726867458.60502: in VariableManager get_vars() 25039 1726867458.60540: Calling all_inventory to load vars for managed_node1 25039 1726867458.60547: Calling groups_inventory to load vars for managed_node1 25039 1726867458.60550: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867458.60561: Calling all_plugins_play to load vars for managed_node1 25039 1726867458.60564: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867458.60567: Calling groups_plugins_play to load vars for managed_node1 25039 1726867458.62120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867458.64696: done with get_vars() 25039 1726867458.64719: done getting variables 25039 1726867458.64781: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:24:18 -0400 (0:00:00.080) 0:00:16.174 ****** 25039 1726867458.64906: entering _queue_task() for managed_node1/debug 25039 1726867458.65517: worker is 1 (out of 1 available) 25039 1726867458.65530: exiting _queue_task() for managed_node1/debug 25039 1726867458.65542: done queuing things up, now waiting for results queue to drain 25039 1726867458.65543: waiting for pending results... 
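The two "Show … messages" tasks above are plain `debug` actions over the registered `__network_connections_result` fact; the log shows the action plugin loading `debug.py` and then evaluating `__network_connections_result.stderr_lines`. A minimal sketch of such a task (the role's actual task bodies at `tasks/main.yml:177` and `:181` may differ in detail):

```yaml
# Minimal sketch of the debug task whose "ok:" output appears above.
# The variable name matches the log; the exact task in the role
# may carry additional conditions.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
```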
25039 1726867458.66122: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25039 1726867458.66216: in run() - task 0affcac9-a3a5-3ddc-7272-00000000002b 25039 1726867458.66223: variable 'ansible_search_path' from source: unknown 25039 1726867458.66390: variable 'ansible_search_path' from source: unknown 25039 1726867458.66427: calling self._execute() 25039 1726867458.66537: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867458.66541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867458.66548: variable 'omit' from source: magic vars 25039 1726867458.67273: variable 'ansible_distribution_major_version' from source: facts 25039 1726867458.67286: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867458.67520: variable 'network_state' from source: role '' defaults 25039 1726867458.67527: Evaluated conditional (network_state != {}): False 25039 1726867458.67530: when evaluation is False, skipping this task 25039 1726867458.67533: _execute() done 25039 1726867458.67535: dumping result to json 25039 1726867458.67537: done dumping result, returning 25039 1726867458.67540: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-3ddc-7272-00000000002b] 25039 1726867458.67542: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000002b 25039 1726867458.67756: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000002b 25039 1726867458.67760: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 25039 1726867458.67813: no more pending results, returning what we have 25039 1726867458.67817: results queue empty 25039 1726867458.67818: checking for any_errors_fatal 25039 1726867458.67830: done checking for any_errors_fatal 25039 1726867458.67831: checking for 
max_fail_percentage 25039 1726867458.67834: done checking for max_fail_percentage 25039 1726867458.67835: checking to see if all hosts have failed and the running result is not ok 25039 1726867458.67841: done checking to see if all hosts have failed 25039 1726867458.67842: getting the remaining hosts for this loop 25039 1726867458.67844: done getting the remaining hosts for this loop 25039 1726867458.67847: getting the next task for host managed_node1 25039 1726867458.67855: done getting next task for host managed_node1 25039 1726867458.67859: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 25039 1726867458.67862: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867458.67879: getting variables 25039 1726867458.67881: in VariableManager get_vars() 25039 1726867458.67924: Calling all_inventory to load vars for managed_node1 25039 1726867458.67927: Calling groups_inventory to load vars for managed_node1 25039 1726867458.67930: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867458.67944: Calling all_plugins_play to load vars for managed_node1 25039 1726867458.68085: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867458.68089: Calling groups_plugins_play to load vars for managed_node1 25039 1726867458.71096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867458.74448: done with get_vars() 25039 1726867458.74471: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:24:18 -0400 (0:00:00.096) 0:00:16.271 ****** 25039 1726867458.74574: entering _queue_task() for managed_node1/ping 25039 1726867458.74576: Creating lock for ping 25039 1726867458.75316: worker is 1 (out of 1 available) 25039 1726867458.75328: exiting _queue_task() for managed_node1/ping 25039 1726867458.75340: done queuing things up, now waiting for results queue to drain 25039 1726867458.75341: waiting for pending results... 
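The "Re-test connectivity" task entering `_queue_task()` above uses the `ping` action (the log notes "Creating lock for ping"), which round-trips through the normal action plugin and the SSH connection shown in the subsequent `_low_level_execute_command()` chunks. A sketch of an equivalent standalone task, assuming the role uses the builtin module:

```yaml
# Sketch of a connectivity re-test equivalent to the task above.
# ansible.builtin.ping verifies SSH reachability and a usable
# Python interpreter on the managed node; it is not ICMP ping.
- name: Re-test connectivity
  ansible.builtin.ping:
```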
25039 1726867458.76000: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 25039 1726867458.76035: in run() - task 0affcac9-a3a5-3ddc-7272-00000000002c 25039 1726867458.76049: variable 'ansible_search_path' from source: unknown 25039 1726867458.76052: variable 'ansible_search_path' from source: unknown 25039 1726867458.76290: calling self._execute() 25039 1726867458.76376: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867458.76440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867458.76443: variable 'omit' from source: magic vars 25039 1726867458.77134: variable 'ansible_distribution_major_version' from source: facts 25039 1726867458.77144: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867458.77151: variable 'omit' from source: magic vars 25039 1726867458.77204: variable 'omit' from source: magic vars 25039 1726867458.77237: variable 'omit' from source: magic vars 25039 1726867458.77276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867458.77530: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867458.77534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867458.77547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867458.77639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867458.77642: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867458.77645: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867458.77647: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 25039 1726867458.77810: Set connection var ansible_shell_executable to /bin/sh 25039 1726867458.77814: Set connection var ansible_timeout to 10 25039 1726867458.77819: Set connection var ansible_shell_type to sh 25039 1726867458.77822: Set connection var ansible_connection to ssh 25039 1726867458.77829: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867458.77834: Set connection var ansible_pipelining to False 25039 1726867458.77859: variable 'ansible_shell_executable' from source: unknown 25039 1726867458.77862: variable 'ansible_connection' from source: unknown 25039 1726867458.77865: variable 'ansible_module_compression' from source: unknown 25039 1726867458.77868: variable 'ansible_shell_type' from source: unknown 25039 1726867458.77870: variable 'ansible_shell_executable' from source: unknown 25039 1726867458.77872: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867458.77875: variable 'ansible_pipelining' from source: unknown 25039 1726867458.77879: variable 'ansible_timeout' from source: unknown 25039 1726867458.77884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867458.78375: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25039 1726867458.78401: variable 'omit' from source: magic vars 25039 1726867458.78404: starting attempt loop 25039 1726867458.78407: running the handler 25039 1726867458.78412: _low_level_execute_command(): starting 25039 1726867458.78415: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867458.80085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867458.80088: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867458.80091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867458.80094: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 1726867458.80097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867458.80099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867458.80101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867458.80103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867458.80227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867458.80275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867458.81969: stdout chunk (state=3): >>>/root <<< 25039 1726867458.82147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867458.82150: stderr chunk (state=3): >>><<< 25039 1726867458.82163: stdout chunk (state=3): >>><<< 25039 1726867458.82300: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867458.82313: _low_level_execute_command(): starting 25039 1726867458.82320: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249 `" && echo ansible-tmp-1726867458.822993-25782-89978048194249="` echo /root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249 `" ) && sleep 0' 25039 1726867458.83549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867458.83641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867458.83644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867458.83647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867458.85714: stdout chunk (state=3): >>>ansible-tmp-1726867458.822993-25782-89978048194249=/root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249 <<< 25039 1726867458.85723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867458.85764: stderr chunk (state=3): >>><<< 25039 1726867458.86384: stdout chunk (state=3): >>><<< 25039 1726867458.86388: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867458.822993-25782-89978048194249=/root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867458.86391: variable 'ansible_module_compression' from source: unknown 25039 1726867458.86393: ANSIBALLZ: Using lock for ping 25039 1726867458.86395: ANSIBALLZ: Acquiring lock 25039 1726867458.86397: ANSIBALLZ: Lock acquired: 140682438470368 25039 1726867458.86399: ANSIBALLZ: Creating module 25039 1726867459.10834: ANSIBALLZ: Writing module into payload 25039 1726867459.10897: ANSIBALLZ: Writing module 25039 1726867459.10918: ANSIBALLZ: Renaming module 25039 1726867459.10924: ANSIBALLZ: Done creating module 25039 1726867459.11002: variable 'ansible_facts' from source: unknown 25039 1726867459.11180: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249/AnsiballZ_ping.py 25039 1726867459.11419: Sending initial data 25039 1726867459.11423: Sent initial data (151 bytes) 25039 1726867459.12895: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867459.12925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867459.12939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867459.12947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867459.13027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867459.14668: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25039 1726867459.14676: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 25039 1726867459.14685: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 25039 1726867459.14692: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 25039 1726867459.14700: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 25039 1726867459.14706: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 25039 1726867459.14714: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 25039 1726867459.14818: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867459.14822: stderr 
chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867459.14840: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp5ff_8t19 /root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249/AnsiballZ_ping.py <<< 25039 1726867459.14843: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249/AnsiballZ_ping.py" <<< 25039 1726867459.15007: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp5ff_8t19" to remote "/root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249/AnsiballZ_ping.py" <<< 25039 1726867459.16879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867459.16883: stdout chunk (state=3): >>><<< 25039 1726867459.16886: stderr chunk (state=3): >>><<< 25039 1726867459.16888: done transferring module to remote 25039 1726867459.16890: _low_level_execute_command(): starting 25039 1726867459.16893: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249/ /root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249/AnsiballZ_ping.py && sleep 0' 25039 1726867459.17984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867459.18039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867459.18050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867459.18237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867459.18294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867459.18402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867459.20268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867459.20272: stdout chunk (state=3): >>><<< 25039 1726867459.20274: stderr chunk (state=3): >>><<< 25039 1726867459.20279: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867459.20282: _low_level_execute_command(): starting 25039 1726867459.20284: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249/AnsiballZ_ping.py && sleep 0' 25039 1726867459.21989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867459.21993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867459.22387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867459.22547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867459.37419: 
stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 25039 1726867459.38712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 25039 1726867459.38764: stderr chunk (state=3): >>><<< 25039 1726867459.38816: stdout chunk (state=3): >>><<< 25039 1726867459.38837: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
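The module payload executed above returned `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}` on stdout. As a rough editorial sketch (not the actual `ansible.builtin.ping` source), the core logic the transferred `AnsiballZ_ping.py` wrapper carries boils down to echoing back its `data` argument as JSON:

```python
import json

def ping(data="pong"):
    """Minimal stand-in for the ping module's core logic: echo the
    'data' argument back under the 'ping' key. The real module also
    deliberately raises when data == 'crash' to exercise error paths."""
    if data == "crash":
        raise Exception("boom")
    # Shape mirrors the stdout chunk captured in the log above.
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

if __name__ == "__main__":
    print(json.dumps(ping()))
```

The controller parses that single JSON line from `rc=0, stdout=...` into the task result shown a few records later (`ok: [managed_node1] => {"changed": false, "ping": "pong"}`).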
25039 1726867459.38862: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867459.38874: _low_level_execute_command(): starting 25039 1726867459.38879: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249/ > /dev/null 2>&1 && sleep 0' 25039 1726867459.40161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867459.40383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867459.40521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867459.40564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867459.42507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867459.42512: stdout chunk (state=3): >>><<< 25039 1726867459.42515: stderr chunk (state=3): >>><<< 25039 1726867459.42522: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867459.42528: handler run complete 25039 1726867459.42530: attempt loop complete, returning result 25039 1726867459.42531: _execute() done 25039 1726867459.42533: dumping result to json 25039 1726867459.42535: 
done dumping result, returning 25039 1726867459.42537: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-3ddc-7272-00000000002c] 25039 1726867459.42539: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000002c ok: [managed_node1] => { "changed": false, "ping": "pong" } 25039 1726867459.42683: no more pending results, returning what we have 25039 1726867459.42687: results queue empty 25039 1726867459.42688: checking for any_errors_fatal 25039 1726867459.42694: done checking for any_errors_fatal 25039 1726867459.42695: checking for max_fail_percentage 25039 1726867459.42697: done checking for max_fail_percentage 25039 1726867459.42698: checking to see if all hosts have failed and the running result is not ok 25039 1726867459.42699: done checking to see if all hosts have failed 25039 1726867459.42699: getting the remaining hosts for this loop 25039 1726867459.42702: done getting the remaining hosts for this loop 25039 1726867459.42705: getting the next task for host managed_node1 25039 1726867459.42716: done getting next task for host managed_node1 25039 1726867459.42719: ^ task is: TASK: meta (role_complete) 25039 1726867459.42722: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867459.42738: getting variables 25039 1726867459.42740: in VariableManager get_vars() 25039 1726867459.42782: Calling all_inventory to load vars for managed_node1 25039 1726867459.42785: Calling groups_inventory to load vars for managed_node1 25039 1726867459.42788: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867459.42800: Calling all_plugins_play to load vars for managed_node1 25039 1726867459.42804: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867459.42808: Calling groups_plugins_play to load vars for managed_node1 25039 1726867459.43691: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000002c 25039 1726867459.43694: WORKER PROCESS EXITING 25039 1726867459.45694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867459.49070: done with get_vars() 25039 1726867459.49101: done getting variables 25039 1726867459.49311: done queuing things up, now waiting for results queue to drain 25039 1726867459.49314: results queue empty 25039 1726867459.49315: checking for any_errors_fatal 25039 1726867459.49318: done checking for any_errors_fatal 25039 1726867459.49318: checking for max_fail_percentage 25039 1726867459.49319: done checking for max_fail_percentage 25039 1726867459.49320: checking to see if all hosts have failed and the running result is not ok 25039 1726867459.49321: done checking to see if all hosts have failed 25039 1726867459.49321: getting the remaining hosts for this loop 25039 1726867459.49322: done getting the remaining hosts for this loop 25039 1726867459.49325: getting the next task for host managed_node1 25039 1726867459.49329: done getting next task for host managed_node1 25039 1726867459.49332: ^ task is: TASK: Include the task 'assert_device_present.yml' 25039 1726867459.49333: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867459.49336: getting variables 25039 1726867459.49336: in VariableManager get_vars() 25039 1726867459.49351: Calling all_inventory to load vars for managed_node1 25039 1726867459.49353: Calling groups_inventory to load vars for managed_node1 25039 1726867459.49355: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867459.49360: Calling all_plugins_play to load vars for managed_node1 25039 1726867459.49362: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867459.49365: Calling groups_plugins_play to load vars for managed_node1 25039 1726867459.52048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867459.54946: done with get_vars() 25039 1726867459.54970: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:47 Friday 20 September 2024 17:24:19 -0400 (0:00:00.806) 0:00:17.078 ****** 25039 1726867459.55251: entering _queue_task() for managed_node1/include_tasks 25039 1726867459.56052: worker is 1 (out of 1 available) 25039 1726867459.56065: exiting _queue_task() for managed_node1/include_tasks 25039 1726867459.56080: done queuing things up, now waiting for results queue to drain 25039 1726867459.56081: waiting for pending results... 
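Earlier in the connectivity task, the wrapped shell command created a uniquely named remote workdir, `/root/.ansible/tmp/ansible-tmp-1726867458.822993-25782-89978048194249`. The naming scheme visible in the log, `ansible-tmp-<epoch seconds>-<pid>-<random>`, can be reproduced with a short sketch (the pattern is inferred from the log, not taken from ansible-core's exact implementation):

```python
import os
import random
import time

def remote_tmp_name():
    """Build a tmp-dir basename matching the pattern seen in the log:
    ansible-tmp-<epoch seconds with fraction>-<pid>-<random integer>."""
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2**48))

print(remote_tmp_name())  # e.g. ansible-tmp-1726867458.822993-25782-89978048194249
```

The uniqueness matters here because the same name is created, written into, chmod'ed, executed from, and finally `rm -rf`'d across five separate `_low_level_execute_command()` calls without any server-side locking.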
25039 1726867459.56565: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 25039 1726867459.56845: in run() - task 0affcac9-a3a5-3ddc-7272-00000000005c 25039 1726867459.56903: variable 'ansible_search_path' from source: unknown 25039 1726867459.56910: calling self._execute() 25039 1726867459.56983: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867459.56990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867459.57229: variable 'omit' from source: magic vars 25039 1726867459.57987: variable 'ansible_distribution_major_version' from source: facts 25039 1726867459.57991: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867459.57994: _execute() done 25039 1726867459.57996: dumping result to json 25039 1726867459.57999: done dumping result, returning 25039 1726867459.58002: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [0affcac9-a3a5-3ddc-7272-00000000005c] 25039 1726867459.58059: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000005c 25039 1726867459.58155: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000005c 25039 1726867459.58158: WORKER PROCESS EXITING 25039 1726867459.58406: no more pending results, returning what we have 25039 1726867459.58412: in VariableManager get_vars() 25039 1726867459.58462: Calling all_inventory to load vars for managed_node1 25039 1726867459.58466: Calling groups_inventory to load vars for managed_node1 25039 1726867459.58468: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867459.58485: Calling all_plugins_play to load vars for managed_node1 25039 1726867459.58488: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867459.58492: Calling groups_plugins_play to load vars for managed_node1 25039 1726867459.61197: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867459.64686: done with get_vars() 25039 1726867459.64781: variable 'ansible_search_path' from source: unknown 25039 1726867459.64797: we have included files to process 25039 1726867459.64799: generating all_blocks data 25039 1726867459.64801: done generating all_blocks data 25039 1726867459.64806: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 25039 1726867459.64807: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 25039 1726867459.64810: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 25039 1726867459.65093: in VariableManager get_vars() 25039 1726867459.65116: done with get_vars() 25039 1726867459.65340: done processing included file 25039 1726867459.65342: iterating over new_blocks loaded from include file 25039 1726867459.65344: in VariableManager get_vars() 25039 1726867459.65479: done with get_vars() 25039 1726867459.65481: filtering new block on tags 25039 1726867459.65501: done filtering new block on tags 25039 1726867459.65504: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 25039 1726867459.65509: extending task lists for all hosts with included blocks 25039 1726867459.69971: done extending task lists 25039 1726867459.69972: done processing included files 25039 1726867459.69973: results queue empty 25039 1726867459.69974: checking for any_errors_fatal 25039 1726867459.69976: done checking for any_errors_fatal 25039 1726867459.69978: checking for max_fail_percentage 25039 1726867459.70036: done 
checking for max_fail_percentage 25039 1726867459.70037: checking to see if all hosts have failed and the running result is not ok 25039 1726867459.70039: done checking to see if all hosts have failed 25039 1726867459.70039: getting the remaining hosts for this loop 25039 1726867459.70041: done getting the remaining hosts for this loop 25039 1726867459.70044: getting the next task for host managed_node1 25039 1726867459.70048: done getting next task for host managed_node1 25039 1726867459.70050: ^ task is: TASK: Include the task 'get_interface_stat.yml' 25039 1726867459.70054: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867459.70056: getting variables 25039 1726867459.70057: in VariableManager get_vars() 25039 1726867459.70074: Calling all_inventory to load vars for managed_node1 25039 1726867459.70079: Calling groups_inventory to load vars for managed_node1 25039 1726867459.70081: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867459.70087: Calling all_plugins_play to load vars for managed_node1 25039 1726867459.70089: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867459.70092: Calling groups_plugins_play to load vars for managed_node1 25039 1726867459.72528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867459.75634: done with get_vars() 25039 1726867459.75662: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:24:19 -0400 (0:00:00.206) 0:00:17.284 ****** 25039 1726867459.75863: entering _queue_task() for managed_node1/include_tasks 25039 1726867459.76821: worker is 1 (out of 1 available) 25039 1726867459.76830: exiting _queue_task() for managed_node1/include_tasks 25039 1726867459.76840: done queuing things up, now waiting for results queue to drain 25039 1726867459.76842: waiting for pending results... 
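Annotation: the task queued above lives at `assert_device_present.yml:3` and, per the log, simply pulls in `get_interface_stat.yml` (both files sit in the same `playbooks/tasks/` directory). A minimal sketch of what that entry likely looks like — reconstructed from the task name and include target in the log, not the verified file contents; the `when` condition is inferred from the "Evaluated conditional (ansible_distribution_major_version != '6')" line and may actually be applied at a higher (block/play) level:

```yaml
# Hypothetical sketch of tasks/assert_device_present.yml (line 3 onward).
# Reconstructed from the log; not the actual source file.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml
```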
25039 1726867459.77567: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 25039 1726867459.77573: in run() - task 0affcac9-a3a5-3ddc-7272-0000000002b5 25039 1726867459.77576: variable 'ansible_search_path' from source: unknown 25039 1726867459.77580: variable 'ansible_search_path' from source: unknown 25039 1726867459.77583: calling self._execute() 25039 1726867459.77643: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867459.77646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867459.77799: variable 'omit' from source: magic vars 25039 1726867459.78572: variable 'ansible_distribution_major_version' from source: facts 25039 1726867459.78586: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867459.78593: _execute() done 25039 1726867459.78685: dumping result to json 25039 1726867459.78689: done dumping result, returning 25039 1726867459.78693: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-3ddc-7272-0000000002b5] 25039 1726867459.78696: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000002b5 25039 1726867459.78810: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000002b5 25039 1726867459.78929: WORKER PROCESS EXITING 25039 1726867459.78961: no more pending results, returning what we have 25039 1726867459.78967: in VariableManager get_vars() 25039 1726867459.79018: Calling all_inventory to load vars for managed_node1 25039 1726867459.79021: Calling groups_inventory to load vars for managed_node1 25039 1726867459.79023: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867459.79038: Calling all_plugins_play to load vars for managed_node1 25039 1726867459.79042: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867459.79045: Calling groups_plugins_play to load vars for managed_node1 25039 
1726867459.82084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867459.85339: done with get_vars() 25039 1726867459.85478: variable 'ansible_search_path' from source: unknown 25039 1726867459.85480: variable 'ansible_search_path' from source: unknown 25039 1726867459.85519: we have included files to process 25039 1726867459.85520: generating all_blocks data 25039 1726867459.85522: done generating all_blocks data 25039 1726867459.85523: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25039 1726867459.85524: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25039 1726867459.85526: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25039 1726867459.85997: done processing included file 25039 1726867459.85999: iterating over new_blocks loaded from include file 25039 1726867459.86001: in VariableManager get_vars() 25039 1726867459.86091: done with get_vars() 25039 1726867459.86094: filtering new block on tags 25039 1726867459.86114: done filtering new block on tags 25039 1726867459.86116: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 25039 1726867459.86235: extending task lists for all hosts with included blocks 25039 1726867459.86483: done extending task lists 25039 1726867459.86484: done processing included files 25039 1726867459.86485: results queue empty 25039 1726867459.86486: checking for any_errors_fatal 25039 1726867459.86489: done checking for any_errors_fatal 25039 1726867459.86490: checking for max_fail_percentage 25039 1726867459.86491: done checking for 
max_fail_percentage 25039 1726867459.86491: checking to see if all hosts have failed and the running result is not ok 25039 1726867459.86492: done checking to see if all hosts have failed 25039 1726867459.86493: getting the remaining hosts for this loop 25039 1726867459.86494: done getting the remaining hosts for this loop 25039 1726867459.86497: getting the next task for host managed_node1 25039 1726867459.86501: done getting next task for host managed_node1 25039 1726867459.86503: ^ task is: TASK: Get stat for interface {{ interface }} 25039 1726867459.86506: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867459.86508: getting variables 25039 1726867459.86509: in VariableManager get_vars() 25039 1726867459.86522: Calling all_inventory to load vars for managed_node1 25039 1726867459.86525: Calling groups_inventory to load vars for managed_node1 25039 1726867459.86527: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867459.86532: Calling all_plugins_play to load vars for managed_node1 25039 1726867459.86534: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867459.86537: Calling groups_plugins_play to load vars for managed_node1 25039 1726867459.88921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867459.92039: done with get_vars() 25039 1726867459.92174: done getting variables 25039 1726867459.92444: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:24:19 -0400 (0:00:00.166) 0:00:17.450 ****** 25039 1726867459.92474: entering _queue_task() for managed_node1/stat 25039 1726867459.93141: worker is 1 (out of 1 available) 25039 1726867459.93380: exiting _queue_task() for managed_node1/stat 25039 1726867459.93392: done queuing things up, now waiting for results queue to drain 25039 1726867459.93393: waiting for pending results... 
25039 1726867459.93949: running TaskExecutor() for managed_node1/TASK: Get stat for interface veth0 25039 1726867459.93956: in run() - task 0affcac9-a3a5-3ddc-7272-0000000003a0 25039 1726867459.93984: variable 'ansible_search_path' from source: unknown 25039 1726867459.93988: variable 'ansible_search_path' from source: unknown 25039 1726867459.94092: calling self._execute() 25039 1726867459.94207: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867459.94216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867459.94225: variable 'omit' from source: magic vars 25039 1726867459.95160: variable 'ansible_distribution_major_version' from source: facts 25039 1726867459.95164: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867459.95166: variable 'omit' from source: magic vars 25039 1726867459.95208: variable 'omit' from source: magic vars 25039 1726867459.95407: variable 'interface' from source: play vars 25039 1726867459.95429: variable 'omit' from source: magic vars 25039 1726867459.95592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867459.95627: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867459.95647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867459.95782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867459.95794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867459.95832: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867459.95835: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867459.95837: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867459.96061: Set connection var ansible_shell_executable to /bin/sh 25039 1726867459.96068: Set connection var ansible_timeout to 10 25039 1726867459.96074: Set connection var ansible_shell_type to sh 25039 1726867459.96079: Set connection var ansible_connection to ssh 25039 1726867459.96194: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867459.96202: Set connection var ansible_pipelining to False 25039 1726867459.96228: variable 'ansible_shell_executable' from source: unknown 25039 1726867459.96240: variable 'ansible_connection' from source: unknown 25039 1726867459.96247: variable 'ansible_module_compression' from source: unknown 25039 1726867459.96249: variable 'ansible_shell_type' from source: unknown 25039 1726867459.96252: variable 'ansible_shell_executable' from source: unknown 25039 1726867459.96254: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867459.96256: variable 'ansible_pipelining' from source: unknown 25039 1726867459.96258: variable 'ansible_timeout' from source: unknown 25039 1726867459.96260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867459.96680: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25039 1726867459.96684: variable 'omit' from source: magic vars 25039 1726867459.96687: starting attempt loop 25039 1726867459.96689: running the handler 25039 1726867459.96691: _low_level_execute_command(): starting 25039 1726867459.96787: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867459.98114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867459.98118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867459.98285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867459.98290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867459.98306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867459.98389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867460.00090: stdout chunk (state=3): >>>/root <<< 25039 1726867460.00218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867460.00221: stderr chunk (state=3): >>><<< 25039 1726867460.00227: stdout chunk (state=3): >>><<< 25039 1726867460.00260: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867460.00274: _low_level_execute_command(): starting 25039 1726867460.00286: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739 `" && echo ansible-tmp-1726867460.0025992-25812-25847775598739="` echo /root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739 `" ) && sleep 0' 25039 1726867460.01463: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867460.01611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867460.01615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867460.01617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867460.01761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867460.03573: stdout chunk (state=3): >>>ansible-tmp-1726867460.0025992-25812-25847775598739=/root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739 <<< 25039 1726867460.03691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867460.03734: stderr chunk (state=3): >>><<< 25039 1726867460.03737: stdout chunk (state=3): >>><<< 25039 1726867460.03754: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867460.0025992-25812-25847775598739=/root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867460.03799: variable 'ansible_module_compression' from source: unknown 25039 1726867460.03857: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 25039 1726867460.04100: variable 'ansible_facts' from source: unknown 25039 1726867460.04373: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739/AnsiballZ_stat.py 25039 1726867460.04497: Sending initial data 25039 1726867460.04501: Sent initial data (152 bytes) 25039 1726867460.05449: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867460.05453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867460.05484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867460.05491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867460.05496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25039 1726867460.05623: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867460.05665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867460.07186: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867460.07273: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739/AnsiballZ_stat.py" <<< 25039 1726867460.07322: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp7ngr35vf /root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739/AnsiballZ_stat.py <<< 25039 1726867460.07486: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp7ngr35vf" to remote "/root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739/AnsiballZ_stat.py" <<< 25039 1726867460.08595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867460.08611: stderr chunk (state=3): >>><<< 25039 1726867460.08616: stdout chunk (state=3): >>><<< 25039 1726867460.08692: done transferring module to remote 25039 1726867460.08697: _low_level_execute_command(): starting 25039 1726867460.08707: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739/ /root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739/AnsiballZ_stat.py && sleep 0' 25039 1726867460.09273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867460.09383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867460.09393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867460.09397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867460.09413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867460.09487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867460.11695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867460.11699: stdout chunk (state=3): >>><<< 25039 1726867460.11706: stderr chunk (state=3): >>><<< 25039 1726867460.11712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867460.11714: _low_level_execute_command(): starting 25039 1726867460.11716: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739/AnsiballZ_stat.py && sleep 0' 25039 1726867460.13302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867460.13600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867460.13682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867460.28889: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, 
"isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31060, "dev": 23, "nlink": 1, "atime": 1726867448.7071328, "mtime": 1726867448.7071328, "ctime": 1726867448.7071328, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 25039 1726867460.30250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 25039 1726867460.30254: stdout chunk (state=3): >>><<< 25039 1726867460.30257: stderr chunk (state=3): >>><<< 25039 1726867460.30292: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31060, "dev": 23, "nlink": 1, "atime": 1726867448.7071328, "mtime": 1726867448.7071328, "ctime": 1726867448.7071328, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": 
"/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867460.30390: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867460.30457: _low_level_execute_command(): starting 25039 1726867460.30474: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867460.0025992-25812-25847775598739/ > /dev/null 2>&1 && sleep 0' 25039 1726867460.31884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867460.32209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867460.32404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867460.32457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867460.32502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867460.34687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867460.34690: stdout chunk (state=3): >>><<< 25039 1726867460.34692: stderr chunk (state=3): >>><<< 25039 1726867460.34696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867460.34699: handler run complete 25039 
1726867460.34701: attempt loop complete, returning result 25039 1726867460.34703: _execute() done 25039 1726867460.34705: dumping result to json 25039 1726867460.34707: done dumping result, returning 25039 1726867460.34712: done running TaskExecutor() for managed_node1/TASK: Get stat for interface veth0 [0affcac9-a3a5-3ddc-7272-0000000003a0] 25039 1726867460.34714: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000003a0 ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726867448.7071328, "block_size": 4096, "blocks": 0, "ctime": 1726867448.7071328, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31060, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726867448.7071328, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 25039 1726867460.34993: no more pending results, returning what we have 25039 1726867460.34998: results queue empty 25039 1726867460.35001: checking for any_errors_fatal 25039 1726867460.35003: done checking for any_errors_fatal 25039 1726867460.35004: checking for max_fail_percentage 25039 1726867460.35005: done checking for max_fail_percentage 25039 1726867460.35006: checking to see if all hosts have failed and the running result is not ok 25039 1726867460.35009: done checking to see if all hosts have failed 25039 1726867460.35009: getting the remaining hosts for this loop 25039 1726867460.35011: done getting the remaining hosts for this loop 25039 1726867460.35014: getting the next task for host managed_node1 25039 1726867460.35023: done getting next task for 
host managed_node1 25039 1726867460.35028: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 25039 1726867460.35031: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867460.35035: getting variables 25039 1726867460.35036: in VariableManager get_vars() 25039 1726867460.35375: Calling all_inventory to load vars for managed_node1 25039 1726867460.35402: Calling groups_inventory to load vars for managed_node1 25039 1726867460.35406: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867460.35428: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000003a0 25039 1726867460.35431: WORKER PROCESS EXITING 25039 1726867460.35450: Calling all_plugins_play to load vars for managed_node1 25039 1726867460.35454: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867460.35458: Calling groups_plugins_play to load vars for managed_node1 25039 1726867460.47421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867460.49385: done with get_vars() 25039 1726867460.49416: done getting variables 25039 1726867460.49527: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, 
class_only=True) 25039 1726867460.49655: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:24:20 -0400 (0:00:00.572) 0:00:18.022 ****** 25039 1726867460.49680: entering _queue_task() for managed_node1/assert 25039 1726867460.49681: Creating lock for assert 25039 1726867460.50155: worker is 1 (out of 1 available) 25039 1726867460.50167: exiting _queue_task() for managed_node1/assert 25039 1726867460.50182: done queuing things up, now waiting for results queue to drain 25039 1726867460.50184: waiting for pending results... 25039 1726867460.50471: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'veth0' 25039 1726867460.50654: in run() - task 0affcac9-a3a5-3ddc-7272-0000000002b6 25039 1726867460.50661: variable 'ansible_search_path' from source: unknown 25039 1726867460.50664: variable 'ansible_search_path' from source: unknown 25039 1726867460.50667: calling self._execute() 25039 1726867460.50775: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867460.50783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867460.50798: variable 'omit' from source: magic vars 25039 1726867460.51460: variable 'ansible_distribution_major_version' from source: facts 25039 1726867460.51463: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867460.51467: variable 'omit' from source: magic vars 25039 1726867460.51470: variable 'omit' from source: magic vars 25039 1726867460.51787: variable 'interface' from source: play vars 25039 1726867460.51885: variable 'omit' from source: magic vars 25039 1726867460.51888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867460.51910: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867460.51940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867460.51971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867460.51996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867460.52039: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867460.52047: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867460.52056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867460.52189: Set connection var ansible_shell_executable to /bin/sh 25039 1726867460.52219: Set connection var ansible_timeout to 10 25039 1726867460.52243: Set connection var ansible_shell_type to sh 25039 1726867460.52324: Set connection var ansible_connection to ssh 25039 1726867460.52328: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867460.52331: Set connection var ansible_pipelining to False 25039 1726867460.52334: variable 'ansible_shell_executable' from source: unknown 25039 1726867460.52337: variable 'ansible_connection' from source: unknown 25039 1726867460.52340: variable 'ansible_module_compression' from source: unknown 25039 1726867460.52343: variable 'ansible_shell_type' from source: unknown 25039 1726867460.52345: variable 'ansible_shell_executable' from source: unknown 25039 1726867460.52348: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867460.52369: variable 'ansible_pipelining' from source: unknown 25039 1726867460.52432: variable 'ansible_timeout' from source: unknown 25039 1726867460.52435: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 25039 1726867460.52550: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867460.52565: variable 'omit' from source: magic vars 25039 1726867460.52576: starting attempt loop 25039 1726867460.52650: running the handler 25039 1726867460.52750: variable 'interface_stat' from source: set_fact 25039 1726867460.52782: Evaluated conditional (interface_stat.stat.exists): True 25039 1726867460.52793: handler run complete 25039 1726867460.52818: attempt loop complete, returning result 25039 1726867460.52825: _execute() done 25039 1726867460.52831: dumping result to json 25039 1726867460.52868: done dumping result, returning 25039 1726867460.52872: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'veth0' [0affcac9-a3a5-3ddc-7272-0000000002b6] 25039 1726867460.52875: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000002b6 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 25039 1726867460.53237: no more pending results, returning what we have 25039 1726867460.53240: results queue empty 25039 1726867460.53241: checking for any_errors_fatal 25039 1726867460.53248: done checking for any_errors_fatal 25039 1726867460.53248: checking for max_fail_percentage 25039 1726867460.53250: done checking for max_fail_percentage 25039 1726867460.53251: checking to see if all hosts have failed and the running result is not ok 25039 1726867460.53252: done checking to see if all hosts have failed 25039 1726867460.53252: getting the remaining hosts for this loop 25039 1726867460.53254: done getting the remaining hosts for this loop 25039 1726867460.53257: getting the next task for host managed_node1 25039 1726867460.53263: done getting next task for 
host managed_node1 25039 1726867460.53266: ^ task is: TASK: Include the task 'assert_profile_present.yml' 25039 1726867460.53267: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867460.53271: getting variables 25039 1726867460.53273: in VariableManager get_vars() 25039 1726867460.53307: Calling all_inventory to load vars for managed_node1 25039 1726867460.53310: Calling groups_inventory to load vars for managed_node1 25039 1726867460.53313: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867460.53322: Calling all_plugins_play to load vars for managed_node1 25039 1726867460.53325: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867460.53329: Calling groups_plugins_play to load vars for managed_node1 25039 1726867460.53846: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000002b6 25039 1726867460.53856: WORKER PROCESS EXITING 25039 1726867460.55332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867460.57638: done with get_vars() 25039 1726867460.57702: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:49 Friday 20 September 2024 17:24:20 -0400 (0:00:00.081) 0:00:18.104 ****** 25039 1726867460.57902: entering _queue_task() for managed_node1/include_tasks 25039 1726867460.58500: worker is 1 (out of 1 available) 25039 1726867460.58512: exiting _queue_task() for managed_node1/include_tasks 25039 1726867460.58523: done queuing things up, now waiting for results queue to drain 25039 1726867460.58524: waiting for pending 
results... 25039 1726867460.58767: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' 25039 1726867460.58920: in run() - task 0affcac9-a3a5-3ddc-7272-00000000005d 25039 1726867460.58938: variable 'ansible_search_path' from source: unknown 25039 1726867460.59022: calling self._execute() 25039 1726867460.59125: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867460.59137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867460.59150: variable 'omit' from source: magic vars 25039 1726867460.59997: variable 'ansible_distribution_major_version' from source: facts 25039 1726867460.60001: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867460.60003: _execute() done 25039 1726867460.60006: dumping result to json 25039 1726867460.60010: done dumping result, returning 25039 1726867460.60012: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' [0affcac9-a3a5-3ddc-7272-00000000005d] 25039 1726867460.60014: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000005d 25039 1726867460.60231: no more pending results, returning what we have 25039 1726867460.60237: in VariableManager get_vars() 25039 1726867460.60288: Calling all_inventory to load vars for managed_node1 25039 1726867460.60292: Calling groups_inventory to load vars for managed_node1 25039 1726867460.60294: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867460.60483: Calling all_plugins_play to load vars for managed_node1 25039 1726867460.60488: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867460.60492: Calling groups_plugins_play to load vars for managed_node1 25039 1726867460.61283: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000005d 25039 1726867460.61287: WORKER PROCESS EXITING 25039 1726867460.63160: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867460.65169: done with get_vars() 25039 1726867460.65193: variable 'ansible_search_path' from source: unknown 25039 1726867460.65214: we have included files to process 25039 1726867460.65216: generating all_blocks data 25039 1726867460.65218: done generating all_blocks data 25039 1726867460.65223: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 25039 1726867460.65224: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 25039 1726867460.65227: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 25039 1726867460.65445: in VariableManager get_vars() 25039 1726867460.65469: done with get_vars() 25039 1726867460.65758: done processing included file 25039 1726867460.65760: iterating over new_blocks loaded from include file 25039 1726867460.65761: in VariableManager get_vars() 25039 1726867460.65781: done with get_vars() 25039 1726867460.65783: filtering new block on tags 25039 1726867460.65804: done filtering new block on tags 25039 1726867460.65807: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 25039 1726867460.65814: extending task lists for all hosts with included blocks 25039 1726867460.68390: done extending task lists 25039 1726867460.68392: done processing included files 25039 1726867460.68393: results queue empty 25039 1726867460.68394: checking for any_errors_fatal 25039 1726867460.68396: done checking for any_errors_fatal 25039 1726867460.68397: checking for max_fail_percentage 25039 1726867460.68398: done 
checking for max_fail_percentage 25039 1726867460.68399: checking to see if all hosts have failed and the running result is not ok 25039 1726867460.68400: done checking to see if all hosts have failed 25039 1726867460.68401: getting the remaining hosts for this loop 25039 1726867460.68402: done getting the remaining hosts for this loop 25039 1726867460.68404: getting the next task for host managed_node1 25039 1726867460.68410: done getting next task for host managed_node1 25039 1726867460.68412: ^ task is: TASK: Include the task 'get_profile_stat.yml' 25039 1726867460.68415: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867460.68417: getting variables 25039 1726867460.68418: in VariableManager get_vars() 25039 1726867460.68438: Calling all_inventory to load vars for managed_node1 25039 1726867460.68440: Calling groups_inventory to load vars for managed_node1 25039 1726867460.68442: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867460.68448: Calling all_plugins_play to load vars for managed_node1 25039 1726867460.68450: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867460.68454: Calling groups_plugins_play to load vars for managed_node1 25039 1726867460.69783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867460.71448: done with get_vars() 25039 1726867460.71480: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 17:24:20 -0400 (0:00:00.136) 0:00:18.241 ****** 25039 1726867460.71557: entering _queue_task() for managed_node1/include_tasks 25039 1726867460.72098: worker is 1 (out of 1 available) 25039 1726867460.72111: exiting _queue_task() for managed_node1/include_tasks 25039 1726867460.72121: done queuing things up, now waiting for results queue to drain 25039 1726867460.72122: waiting for pending results... 
25039 1726867460.72258: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 25039 1726867460.72376: in run() - task 0affcac9-a3a5-3ddc-7272-0000000003b8 25039 1726867460.72398: variable 'ansible_search_path' from source: unknown 25039 1726867460.72410: variable 'ansible_search_path' from source: unknown 25039 1726867460.72460: calling self._execute() 25039 1726867460.72558: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867460.72584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867460.72687: variable 'omit' from source: magic vars 25039 1726867460.73028: variable 'ansible_distribution_major_version' from source: facts 25039 1726867460.73116: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867460.73119: _execute() done 25039 1726867460.73121: dumping result to json 25039 1726867460.73126: done dumping result, returning 25039 1726867460.73132: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-3ddc-7272-0000000003b8] 25039 1726867460.73134: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000003b8 25039 1726867460.73194: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000003b8 25039 1726867460.73197: WORKER PROCESS EXITING 25039 1726867460.73264: no more pending results, returning what we have 25039 1726867460.73271: in VariableManager get_vars() 25039 1726867460.73329: Calling all_inventory to load vars for managed_node1 25039 1726867460.73333: Calling groups_inventory to load vars for managed_node1 25039 1726867460.73336: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867460.73354: Calling all_plugins_play to load vars for managed_node1 25039 1726867460.73358: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867460.73361: Calling groups_plugins_play to load vars for managed_node1 25039 
1726867460.75005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867460.76820: done with get_vars() 25039 1726867460.76846: variable 'ansible_search_path' from source: unknown 25039 1726867460.76848: variable 'ansible_search_path' from source: unknown 25039 1726867460.76888: we have included files to process 25039 1726867460.76889: generating all_blocks data 25039 1726867460.76891: done generating all_blocks data 25039 1726867460.76893: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 25039 1726867460.76894: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 25039 1726867460.76896: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 25039 1726867460.78031: done processing included file 25039 1726867460.78038: iterating over new_blocks loaded from include file 25039 1726867460.78040: in VariableManager get_vars() 25039 1726867460.78060: done with get_vars() 25039 1726867460.78062: filtering new block on tags 25039 1726867460.78087: done filtering new block on tags 25039 1726867460.78090: in VariableManager get_vars() 25039 1726867460.78111: done with get_vars() 25039 1726867460.78113: filtering new block on tags 25039 1726867460.78134: done filtering new block on tags 25039 1726867460.78137: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 25039 1726867460.78149: extending task lists for all hosts with included blocks 25039 1726867460.78332: done extending task lists 25039 1726867460.78334: done processing included files 25039 1726867460.78335: results queue empty 25039 
1726867460.78335: checking for any_errors_fatal 25039 1726867460.78339: done checking for any_errors_fatal 25039 1726867460.78340: checking for max_fail_percentage 25039 1726867460.78341: done checking for max_fail_percentage 25039 1726867460.78341: checking to see if all hosts have failed and the running result is not ok 25039 1726867460.78342: done checking to see if all hosts have failed 25039 1726867460.78343: getting the remaining hosts for this loop 25039 1726867460.78344: done getting the remaining hosts for this loop 25039 1726867460.78347: getting the next task for host managed_node1 25039 1726867460.78351: done getting next task for host managed_node1 25039 1726867460.78353: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 25039 1726867460.78356: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867460.78366: getting variables 25039 1726867460.78367: in VariableManager get_vars() 25039 1726867460.78430: Calling all_inventory to load vars for managed_node1 25039 1726867460.78433: Calling groups_inventory to load vars for managed_node1 25039 1726867460.78435: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867460.78440: Calling all_plugins_play to load vars for managed_node1 25039 1726867460.78442: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867460.78445: Calling groups_plugins_play to load vars for managed_node1 25039 1726867460.79618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867460.81294: done with get_vars() 25039 1726867460.81315: done getting variables 25039 1726867460.81351: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:24:20 -0400 (0:00:00.098) 0:00:18.339 ****** 25039 1726867460.81388: entering _queue_task() for managed_node1/set_fact 25039 1726867460.81736: worker is 1 (out of 1 available) 25039 1726867460.81748: exiting _queue_task() for managed_node1/set_fact 25039 1726867460.81760: done queuing things up, now waiting for results queue to drain 25039 1726867460.81761: waiting for pending results... 
25039 1726867460.81994: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 25039 1726867460.82026: in run() - task 0affcac9-a3a5-3ddc-7272-0000000004b0 25039 1726867460.82037: variable 'ansible_search_path' from source: unknown 25039 1726867460.82040: variable 'ansible_search_path' from source: unknown 25039 1726867460.82133: calling self._execute() 25039 1726867460.82157: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867460.82163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867460.82173: variable 'omit' from source: magic vars 25039 1726867460.82516: variable 'ansible_distribution_major_version' from source: facts 25039 1726867460.82528: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867460.82568: variable 'omit' from source: magic vars 25039 1726867460.82678: variable 'omit' from source: magic vars 25039 1726867460.82683: variable 'omit' from source: magic vars 25039 1726867460.82685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867460.82688: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867460.82706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867460.82722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867460.82734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867460.82762: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867460.82765: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867460.82768: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 25039 1726867460.82856: Set connection var ansible_shell_executable to /bin/sh 25039 1726867460.82863: Set connection var ansible_timeout to 10 25039 1726867460.82868: Set connection var ansible_shell_type to sh 25039 1726867460.82870: Set connection var ansible_connection to ssh 25039 1726867460.82880: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867460.82893: Set connection var ansible_pipelining to False 25039 1726867460.82912: variable 'ansible_shell_executable' from source: unknown 25039 1726867460.82915: variable 'ansible_connection' from source: unknown 25039 1726867460.82918: variable 'ansible_module_compression' from source: unknown 25039 1726867460.82920: variable 'ansible_shell_type' from source: unknown 25039 1726867460.82922: variable 'ansible_shell_executable' from source: unknown 25039 1726867460.82924: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867460.82926: variable 'ansible_pipelining' from source: unknown 25039 1726867460.82928: variable 'ansible_timeout' from source: unknown 25039 1726867460.82931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867460.83168: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867460.83224: variable 'omit' from source: magic vars 25039 1726867460.83227: starting attempt loop 25039 1726867460.83235: running the handler 25039 1726867460.83238: handler run complete 25039 1726867460.83249: attempt loop complete, returning result 25039 1726867460.83257: _execute() done 25039 1726867460.83264: dumping result to json 25039 1726867460.83271: done dumping result, returning 25039 1726867460.83284: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-3ddc-7272-0000000004b0] 25039 1726867460.83333: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b0 25039 1726867460.83402: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b0 25039 1726867460.83406: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 25039 1726867460.83619: no more pending results, returning what we have 25039 1726867460.83623: results queue empty 25039 1726867460.83625: checking for any_errors_fatal 25039 1726867460.83627: done checking for any_errors_fatal 25039 1726867460.83627: checking for max_fail_percentage 25039 1726867460.83629: done checking for max_fail_percentage 25039 1726867460.83630: checking to see if all hosts have failed and the running result is not ok 25039 1726867460.83631: done checking to see if all hosts have failed 25039 1726867460.83632: getting the remaining hosts for this loop 25039 1726867460.83634: done getting the remaining hosts for this loop 25039 1726867460.83637: getting the next task for host managed_node1 25039 1726867460.83646: done getting next task for host managed_node1 25039 1726867460.83649: ^ task is: TASK: Stat profile file 25039 1726867460.83654: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867460.83659: getting variables 25039 1726867460.83661: in VariableManager get_vars() 25039 1726867460.83707: Calling all_inventory to load vars for managed_node1 25039 1726867460.83713: Calling groups_inventory to load vars for managed_node1 25039 1726867460.83716: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867460.83728: Calling all_plugins_play to load vars for managed_node1 25039 1726867460.83732: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867460.83735: Calling groups_plugins_play to load vars for managed_node1 25039 1726867460.85457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867460.86424: done with get_vars() 25039 1726867460.86438: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:24:20 -0400 (0:00:00.051) 0:00:18.390 ****** 25039 1726867460.86502: entering _queue_task() for managed_node1/stat 25039 1726867460.86708: worker is 1 (out of 1 available) 25039 1726867460.86720: exiting _queue_task() for managed_node1/stat 25039 1726867460.86733: done queuing things up, now waiting for results queue to drain 25039 1726867460.86734: waiting for pending results... 
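The task queued here runs Ansible's `stat` module against a network profile file; later in this log the module args show `get_attributes`, `get_checksum`, and `get_mime` all disabled and `follow: false`, so the module effectively reduces to a bare existence check. A minimal sketch of that reduced behavior (the helper name `stat_exists` is hypothetical, not ansible-core's actual implementation):

```python
import os

def stat_exists(path: str) -> dict:
    """Rough analogue of the result shape Ansible's stat module returns
    when get_attributes/get_checksum/get_mime are disabled: only an
    existence flag inside a "stat" dict, with changed always False."""
    # follow=false in the module args means symlinks are not resolved,
    # which os.path.lexists() mirrors (it uses lstat under the hood).
    exists = os.path.lexists(path)
    return {"changed": False, "stat": {"exists": exists}}

print(stat_exists("/etc/sysconfig/network-scripts/ifcfg-veth0"))
```

For the missing `ifcfg-veth0` file seen in this run, this sketch yields the same `{"stat": {"exists": false}}` shape the module reports below.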
25039 1726867460.86902: running TaskExecutor() for managed_node1/TASK: Stat profile file 25039 1726867460.86970: in run() - task 0affcac9-a3a5-3ddc-7272-0000000004b1 25039 1726867460.86976: variable 'ansible_search_path' from source: unknown 25039 1726867460.86982: variable 'ansible_search_path' from source: unknown 25039 1726867460.87010: calling self._execute() 25039 1726867460.87073: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867460.87084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867460.87090: variable 'omit' from source: magic vars 25039 1726867460.87453: variable 'ansible_distribution_major_version' from source: facts 25039 1726867460.87566: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867460.87569: variable 'omit' from source: magic vars 25039 1726867460.87582: variable 'omit' from source: magic vars 25039 1726867460.87726: variable 'profile' from source: include params 25039 1726867460.87746: variable 'interface' from source: play vars 25039 1726867460.87828: variable 'interface' from source: play vars 25039 1726867460.87982: variable 'omit' from source: magic vars 25039 1726867460.87986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867460.87989: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867460.87991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867460.87993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867460.88005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867460.88043: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 
1726867460.88051: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867460.88060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867460.88158: Set connection var ansible_shell_executable to /bin/sh 25039 1726867460.88170: Set connection var ansible_timeout to 10 25039 1726867460.88183: Set connection var ansible_shell_type to sh 25039 1726867460.88190: Set connection var ansible_connection to ssh 25039 1726867460.88205: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867460.88216: Set connection var ansible_pipelining to False 25039 1726867460.88243: variable 'ansible_shell_executable' from source: unknown 25039 1726867460.88250: variable 'ansible_connection' from source: unknown 25039 1726867460.88258: variable 'ansible_module_compression' from source: unknown 25039 1726867460.88265: variable 'ansible_shell_type' from source: unknown 25039 1726867460.88271: variable 'ansible_shell_executable' from source: unknown 25039 1726867460.88279: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867460.88293: variable 'ansible_pipelining' from source: unknown 25039 1726867460.88305: variable 'ansible_timeout' from source: unknown 25039 1726867460.88315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867460.88503: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25039 1726867460.88532: variable 'omit' from source: magic vars 25039 1726867460.88550: starting attempt loop 25039 1726867460.88564: running the handler 25039 1726867460.88589: _low_level_execute_command(): starting 25039 1726867460.88617: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867460.89152: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867460.89162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867460.89171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867460.89202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867460.89243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867460.89246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867460.89336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867460.91012: stdout chunk (state=3): >>>/root <<< 25039 1726867460.91147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867460.91161: stderr chunk (state=3): >>><<< 25039 1726867460.91169: stdout chunk (state=3): >>><<< 25039 1726867460.91201: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867460.91283: _low_level_execute_command(): starting 25039 1726867460.91287: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340 `" && echo ansible-tmp-1726867460.9122038-25855-21788535167340="` echo /root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340 `" ) && sleep 0' 25039 1726867460.91854: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867460.91887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867460.91940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867460.92005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867460.92041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867460.92098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867460.92101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867460.92104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867460.92165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867460.94022: stdout chunk (state=3): >>>ansible-tmp-1726867460.9122038-25855-21788535167340=/root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340 <<< 25039 1726867460.94137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867460.94155: stderr chunk (state=3): >>><<< 25039 1726867460.94158: stdout chunk (state=3): >>><<< 25039 1726867460.94186: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867460.9122038-25855-21788535167340=/root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867460.94241: variable 'ansible_module_compression' from source: unknown 25039 1726867460.94332: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 25039 1726867460.94335: variable 'ansible_facts' from source: unknown 25039 1726867460.94519: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340/AnsiballZ_stat.py 25039 1726867460.94614: Sending initial data 25039 1726867460.94618: Sent initial data (152 bytes) 25039 1726867460.95527: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 25039 1726867460.95541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867460.95557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867460.95618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867460.95659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867460.95686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867460.95697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867460.95768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867460.97311: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867460.97471: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867460.97693: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpytfwswpt /root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340/AnsiballZ_stat.py <<< 25039 1726867460.97697: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpytfwswpt" to remote "/root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340/AnsiballZ_stat.py" <<< 25039 1726867460.98673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867460.98743: stderr chunk (state=3): >>><<< 25039 1726867460.98760: stdout chunk (state=3): >>><<< 25039 1726867460.98796: done transferring module to remote 25039 1726867460.98814: _low_level_execute_command(): starting 25039 1726867460.98824: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340/ /root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340/AnsiballZ_stat.py && sleep 0' 25039 1726867460.99462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867460.99518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867460.99591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867460.99641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867460.99658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867460.99730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867461.01497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867461.01548: stderr chunk (state=3): >>><<< 25039 1726867461.01557: stdout chunk (state=3): >>><<< 25039 1726867461.01585: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867461.01600: _low_level_execute_command(): starting 25039 1726867461.01614: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340/AnsiballZ_stat.py && sleep 0' 25039 1726867461.02255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867461.02270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867461.02288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867461.02305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867461.02348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 1726867461.02362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867461.02460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867461.02486: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 25039 1726867461.02585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867461.17552: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 25039 1726867461.18958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 25039 1726867461.18993: stdout chunk (state=3): >>><<< 25039 1726867461.19206: stderr chunk (state=3): >>><<< 25039 1726867461.19210: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 25039 1726867461.19213: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867461.19217: _low_level_execute_command(): starting 25039 1726867461.19219: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867460.9122038-25855-21788535167340/ > /dev/null 2>&1 && sleep 0' 25039 1726867461.20297: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867461.20316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867461.20328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867461.20384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867461.20404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867461.20759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867461.22630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867461.22684: stderr chunk (state=3): >>><<< 25039 1726867461.22698: stdout chunk (state=3): >>><<< 25039 1726867461.22718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867461.22886: handler run complete 25039 1726867461.22889: attempt loop complete, returning result 25039 1726867461.22892: _execute() done 25039 1726867461.22894: dumping result to json 25039 1726867461.22896: done dumping result, returning 25039 1726867461.22898: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0affcac9-a3a5-3ddc-7272-0000000004b1] 25039 1726867461.22901: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b1 25039 1726867461.23158: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b1 25039 1726867461.23162: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 25039 1726867461.23232: no more pending results, returning what we have 25039 1726867461.23237: results queue empty 25039 1726867461.23238: checking for any_errors_fatal 25039 1726867461.23247: done checking for any_errors_fatal 25039 1726867461.23248: checking for max_fail_percentage 25039 1726867461.23249: done checking for max_fail_percentage 25039 1726867461.23250: checking to see if all hosts have failed and the running result is not ok 25039 1726867461.23251: done checking to see if all hosts have failed 25039 1726867461.23252: getting the remaining hosts for this loop 25039 1726867461.23255: done getting the remaining hosts for this loop 25039 1726867461.23258: getting the next task for host managed_node1 25039 1726867461.23266: done getting next task for host managed_node1 25039 1726867461.23269: ^ task is: TASK: Set NM profile exist flag based on the profile files 25039 1726867461.23275: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867461.23282: getting variables 25039 1726867461.23284: in VariableManager get_vars() 25039 1726867461.23331: Calling all_inventory to load vars for managed_node1 25039 1726867461.23334: Calling groups_inventory to load vars for managed_node1 25039 1726867461.23337: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867461.23348: Calling all_plugins_play to load vars for managed_node1 25039 1726867461.23352: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867461.23355: Calling groups_plugins_play to load vars for managed_node1 25039 1726867461.25692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867461.27452: done with get_vars() 25039 1726867461.27475: done getting variables 25039 1726867461.27551: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** 
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:24:21 -0400 (0:00:00.410) 0:00:18.801 ****** 25039 1726867461.27586: entering _queue_task() for managed_node1/set_fact 25039 1726867461.27914: worker is 1 (out of 1 available) 25039 1726867461.27928: exiting _queue_task() for managed_node1/set_fact 25039 1726867461.27948: done queuing things up, now waiting for results queue to drain 25039 1726867461.27950: waiting for pending results... 25039 1726867461.28119: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 25039 1726867461.28196: in run() - task 0affcac9-a3a5-3ddc-7272-0000000004b2 25039 1726867461.28206: variable 'ansible_search_path' from source: unknown 25039 1726867461.28209: variable 'ansible_search_path' from source: unknown 25039 1726867461.28239: calling self._execute() 25039 1726867461.28316: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867461.28320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867461.28330: variable 'omit' from source: magic vars 25039 1726867461.28591: variable 'ansible_distribution_major_version' from source: facts 25039 1726867461.28601: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867461.28688: variable 'profile_stat' from source: set_fact 25039 1726867461.28699: Evaluated conditional (profile_stat.stat.exists): False 25039 1726867461.28702: when evaluation is False, skipping this task 25039 1726867461.28705: _execute() done 25039 1726867461.28708: dumping result to json 25039 1726867461.28722: done dumping result, returning 25039 1726867461.28725: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-3ddc-7272-0000000004b2] 25039 1726867461.28727: sending task result for task 
0affcac9-a3a5-3ddc-7272-0000000004b2 25039 1726867461.28803: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b2 25039 1726867461.28806: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25039 1726867461.28885: no more pending results, returning what we have 25039 1726867461.28889: results queue empty 25039 1726867461.28890: checking for any_errors_fatal 25039 1726867461.28897: done checking for any_errors_fatal 25039 1726867461.28897: checking for max_fail_percentage 25039 1726867461.28898: done checking for max_fail_percentage 25039 1726867461.28899: checking to see if all hosts have failed and the running result is not ok 25039 1726867461.28900: done checking to see if all hosts have failed 25039 1726867461.28901: getting the remaining hosts for this loop 25039 1726867461.28902: done getting the remaining hosts for this loop 25039 1726867461.28905: getting the next task for host managed_node1 25039 1726867461.28911: done getting next task for host managed_node1 25039 1726867461.28914: ^ task is: TASK: Get NM profile info 25039 1726867461.28917: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 25039 1726867461.28920: getting variables 25039 1726867461.28922: in VariableManager get_vars() 25039 1726867461.28962: Calling all_inventory to load vars for managed_node1 25039 1726867461.28964: Calling groups_inventory to load vars for managed_node1 25039 1726867461.28967: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867461.28976: Calling all_plugins_play to load vars for managed_node1 25039 1726867461.28980: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867461.28983: Calling groups_plugins_play to load vars for managed_node1 25039 1726867461.29814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867461.31001: done with get_vars() 25039 1726867461.31021: done getting variables 25039 1726867461.31079: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:24:21 -0400 (0:00:00.035) 0:00:18.836 ****** 25039 1726867461.31108: entering _queue_task() for managed_node1/shell 25039 1726867461.31354: worker is 1 (out of 1 available) 25039 1726867461.31367: exiting _queue_task() for managed_node1/shell 25039 1726867461.31385: done queuing things up, now waiting for results queue to drain 25039 1726867461.31387: waiting for pending results... 
25039 1726867461.31573: running TaskExecutor() for managed_node1/TASK: Get NM profile info 25039 1726867461.31644: in run() - task 0affcac9-a3a5-3ddc-7272-0000000004b3 25039 1726867461.31655: variable 'ansible_search_path' from source: unknown 25039 1726867461.31658: variable 'ansible_search_path' from source: unknown 25039 1726867461.31689: calling self._execute() 25039 1726867461.31759: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867461.31763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867461.31771: variable 'omit' from source: magic vars 25039 1726867461.32047: variable 'ansible_distribution_major_version' from source: facts 25039 1726867461.32056: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867461.32062: variable 'omit' from source: magic vars 25039 1726867461.32095: variable 'omit' from source: magic vars 25039 1726867461.32168: variable 'profile' from source: include params 25039 1726867461.32172: variable 'interface' from source: play vars 25039 1726867461.32234: variable 'interface' from source: play vars 25039 1726867461.32251: variable 'omit' from source: magic vars 25039 1726867461.32286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867461.32314: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867461.32329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867461.32343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867461.32354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867461.32379: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 
1726867461.32383: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867461.32385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867461.32451: Set connection var ansible_shell_executable to /bin/sh 25039 1726867461.32457: Set connection var ansible_timeout to 10 25039 1726867461.32465: Set connection var ansible_shell_type to sh 25039 1726867461.32467: Set connection var ansible_connection to ssh 25039 1726867461.32473: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867461.32480: Set connection var ansible_pipelining to False 25039 1726867461.32497: variable 'ansible_shell_executable' from source: unknown 25039 1726867461.32500: variable 'ansible_connection' from source: unknown 25039 1726867461.32503: variable 'ansible_module_compression' from source: unknown 25039 1726867461.32505: variable 'ansible_shell_type' from source: unknown 25039 1726867461.32510: variable 'ansible_shell_executable' from source: unknown 25039 1726867461.32512: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867461.32514: variable 'ansible_pipelining' from source: unknown 25039 1726867461.32517: variable 'ansible_timeout' from source: unknown 25039 1726867461.32519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867461.32615: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867461.32624: variable 'omit' from source: magic vars 25039 1726867461.32628: starting attempt loop 25039 1726867461.32631: running the handler 25039 1726867461.32639: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867461.32655: _low_level_execute_command(): starting 25039 1726867461.32663: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867461.33155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867461.33160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867461.33163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867461.33225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867461.33228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867461.33229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867461.33283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867461.34916: stdout chunk (state=3): >>>/root <<< 25039 1726867461.35043: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 25039 1726867461.35070: stderr chunk (state=3): >>><<< 25039 1726867461.35073: stdout chunk (state=3): >>><<< 25039 1726867461.35096: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867461.35121: _low_level_execute_command(): starting 25039 1726867461.35169: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570 `" && echo ansible-tmp-1726867461.351077-25888-247275308268570="` echo /root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570 `" ) && sleep 0' 25039 1726867461.35792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867461.35825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867461.35834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867461.35837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867461.35905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867461.37823: stdout chunk (state=3): >>>ansible-tmp-1726867461.351077-25888-247275308268570=/root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570 <<< 25039 1726867461.37890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867461.37937: stderr chunk (state=3): >>><<< 25039 1726867461.38023: stdout chunk (state=3): >>><<< 25039 1726867461.38027: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867461.351077-25888-247275308268570=/root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867461.38030: variable 'ansible_module_compression' from source: unknown 25039 1726867461.38169: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867461.38218: variable 'ansible_facts' from source: unknown 25039 1726867461.38305: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570/AnsiballZ_command.py 25039 1726867461.38683: Sending initial data 25039 1726867461.38687: Sent initial data (155 bytes) 25039 1726867461.39895: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867461.39922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867461.40064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867461.41533: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25039 1726867461.41549: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25039 1726867461.41552: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867461.41594: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867461.41636: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp760qdx3o /root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570/AnsiballZ_command.py <<< 25039 1726867461.41641: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570/AnsiballZ_command.py" <<< 25039 1726867461.41698: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp760qdx3o" to remote "/root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570/AnsiballZ_command.py" <<< 25039 1726867461.42235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867461.42276: stderr chunk (state=3): >>><<< 25039 1726867461.42281: stdout chunk (state=3): >>><<< 25039 1726867461.42319: done transferring module to remote 25039 1726867461.42328: _low_level_execute_command(): starting 25039 1726867461.42333: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570/ /root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570/AnsiballZ_command.py && sleep 0' 25039 1726867461.42974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867461.42981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867461.43030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867461.44780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867461.44784: stderr chunk (state=3): >>><<< 25039 1726867461.44786: stdout chunk (state=3): >>><<< 25039 1726867461.44796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867461.44803: _low_level_execute_command(): starting 25039 1726867461.44806: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570/AnsiballZ_command.py && sleep 0' 25039 1726867461.45230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867461.45233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867461.45235: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 25039 1726867461.45237: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867461.45240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867461.45289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867461.45293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867461.45344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
25039 1726867461.62230: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 17:24:21.602687", "end": "2024-09-20 17:24:21.620047", "delta": "0:00:00.017360", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867461.63746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 25039 1726867461.63764: stderr chunk (state=3): >>><<< 25039 1726867461.63767: stdout chunk (state=3): >>><<< 25039 1726867461.63785: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 17:24:21.602687", "end": "2024-09-20 17:24:21.620047", "delta": "0:00:00.017360", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 25039 1726867461.63819: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867461.63825: _low_level_execute_command(): starting 25039 1726867461.63830: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867461.351077-25888-247275308268570/ > /dev/null 2>&1 && sleep 0' 25039 1726867461.64354: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867461.64360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867461.64409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867461.64429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867461.64481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867461.66290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867461.66318: stderr chunk (state=3): >>><<< 25039 1726867461.66331: stdout chunk (state=3): >>><<< 25039 1726867461.66383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.12.57 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
25039 1726867461.66390: handler run complete
25039 1726867461.66392: Evaluated conditional (False): False
25039 1726867461.66395: attempt loop complete, returning result
25039 1726867461.66397: _execute() done
25039 1726867461.66400: dumping result to json
25039 1726867461.66405: done dumping result, returning
25039 1726867461.66432: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0affcac9-a3a5-3ddc-7272-0000000004b3]
25039 1726867461.66435: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b3
25039 1726867461.66530: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b3
25039 1726867461.66533: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc",
    "delta": "0:00:00.017360",
    "end": "2024-09-20 17:24:21.620047",
    "rc": 0,
    "start": "2024-09-20 17:24:21.602687"
}

STDOUT:

veth0 /etc/NetworkManager/system-connections/veth0.nmconnection
25039 1726867461.66625: no more pending results, returning what we have
25039 1726867461.66630: results queue empty
25039 1726867461.66631: checking for any_errors_fatal
25039 1726867461.66638: done checking for any_errors_fatal
25039 1726867461.66638: checking for max_fail_percentage
25039 1726867461.66640: done checking for max_fail_percentage
25039 1726867461.66642: checking to see if all hosts have failed and the running result is not ok
25039 1726867461.66643: done checking to see if all hosts have failed
25039 1726867461.66644: getting the remaining hosts for this loop
25039 1726867461.66646: done getting the remaining hosts for this loop
25039 1726867461.66649: getting the next task for host managed_node1
25039 1726867461.66656: done getting next task for host managed_node1
25039 1726867461.66658: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
25039 1726867461.66662: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867461.66667: getting variables
25039 1726867461.66668: in VariableManager get_vars()
25039 1726867461.66746: Calling all_inventory to load vars for managed_node1
25039 1726867461.66749: Calling groups_inventory to load vars for managed_node1
25039 1726867461.66751: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867461.66761: Calling all_plugins_play to load vars for managed_node1
25039 1726867461.66763: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867461.66765: Calling groups_plugins_play to load vars for managed_node1
25039 1726867461.68133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867461.69117: done with get_vars()
25039 1726867461.69133: done getting variables
25039 1726867461.69176: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Friday 20 September 2024 17:24:21 -0400 (0:00:00.380) 0:00:19.217 ******
25039 1726867461.69202: entering _queue_task() for managed_node1/set_fact
25039 1726867461.69438: worker is 1 (out of 1 available)
25039 1726867461.69451: exiting _queue_task() for managed_node1/set_fact
25039 1726867461.69462: done queuing things up, now waiting for results queue to drain
25039 1726867461.69464: waiting for pending results...
25039 1726867461.69703: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
25039 1726867461.69719: in run() - task 0affcac9-a3a5-3ddc-7272-0000000004b4
25039 1726867461.69731: variable 'ansible_search_path' from source: unknown
25039 1726867461.69736: variable 'ansible_search_path' from source: unknown
25039 1726867461.69762: calling self._execute()
25039 1726867461.69836: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867461.69840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867461.69850: variable 'omit' from source: magic vars
25039 1726867461.70120: variable 'ansible_distribution_major_version' from source: facts
25039 1726867461.70134: Evaluated conditional (ansible_distribution_major_version != '6'): True
25039 1726867461.70221: variable 'nm_profile_exists' from source: set_fact
25039 1726867461.70234: Evaluated conditional (nm_profile_exists.rc == 0): True
25039 1726867461.70237: variable 'omit' from source: magic vars
25039 1726867461.70270: variable 'omit' from source: magic vars
25039 1726867461.70294: variable 'omit' from source: magic vars
25039 1726867461.70335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
25039 1726867461.70392: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
25039 1726867461.70395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
25039 1726867461.70415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
25039 1726867461.70427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
25039 1726867461.70458: variable 'inventory_hostname' from source: host vars for 'managed_node1'
25039 1726867461.70462: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867461.70464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867461.70551: Set connection var ansible_shell_executable to /bin/sh
25039 1726867461.70558: Set connection var ansible_timeout to 10
25039 1726867461.70563: Set connection var ansible_shell_type to sh
25039 1726867461.70565: Set connection var ansible_connection to ssh
25039 1726867461.70572: Set connection var ansible_module_compression to ZIP_DEFLATED
25039 1726867461.70585: Set connection var ansible_pipelining to False
25039 1726867461.70605: variable 'ansible_shell_executable' from source: unknown
25039 1726867461.70615: variable 'ansible_connection' from source: unknown
25039 1726867461.70618: variable 'ansible_module_compression' from source: unknown
25039 1726867461.70620: variable 'ansible_shell_type' from source: unknown
25039 1726867461.70623: variable 'ansible_shell_executable' from source: unknown
25039 1726867461.70625: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867461.70627: variable 'ansible_pipelining' from source: unknown
25039 1726867461.70628: variable 'ansible_timeout' from source: unknown
25039 1726867461.70631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867461.70735: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
25039 1726867461.70744: variable 'omit' from source: magic vars
25039 1726867461.70749: starting attempt loop
25039 1726867461.70752: running the handler
25039 1726867461.70761: handler run complete
25039 1726867461.70770: attempt loop complete, returning result
25039 1726867461.70772: _execute() done
25039 1726867461.70775: dumping result to json
25039 1726867461.70778: done dumping result, returning
25039 1726867461.70787: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-3ddc-7272-0000000004b4]
25039 1726867461.70789: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b4
25039 1726867461.70865: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b4
25039 1726867461.70868: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
25039 1726867461.70927: no more pending results, returning what we have
25039 1726867461.70931: results queue empty
25039 1726867461.70932: checking for any_errors_fatal
25039 1726867461.70939: done checking for any_errors_fatal
25039 1726867461.70939: checking for max_fail_percentage
25039 1726867461.70943: done checking for max_fail_percentage
25039 1726867461.70944: checking to see if all hosts have failed and the running result is not ok
25039 1726867461.70945: done checking to see if all hosts have failed
25039 1726867461.70945: getting the remaining hosts for this loop
25039 1726867461.70947: done getting the remaining hosts for this loop
25039 1726867461.70951: getting the next task for host managed_node1
25039 1726867461.70960: done getting next task for host managed_node1
25039 1726867461.70962: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
25039 1726867461.70965: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867461.70969: getting variables
25039 1726867461.70971: in VariableManager get_vars()
25039 1726867461.71059: Calling all_inventory to load vars for managed_node1
25039 1726867461.71062: Calling groups_inventory to load vars for managed_node1
25039 1726867461.71064: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867461.71073: Calling all_plugins_play to load vars for managed_node1
25039 1726867461.71075: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867461.71080: Calling groups_plugins_play to load vars for managed_node1
25039 1726867461.71947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867461.72992: done with get_vars()
25039 1726867461.73008: done getting variables
25039 1726867461.73049: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
25039 1726867461.73147: variable 'profile' from source: include params
25039 1726867461.73150: variable 'interface' from source: play vars
25039 1726867461.73216: variable 'interface' from source: play vars

TASK [Get the ansible_managed comment in ifcfg-veth0] **************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 17:24:21 -0400 (0:00:00.040) 0:00:19.258 ******
25039 1726867461.73258: entering _queue_task() for managed_node1/command
25039 1726867461.73511: worker is 1 (out of 1 available)
25039 1726867461.73523: exiting _queue_task() for managed_node1/command
25039 1726867461.73537: done queuing things up, now waiting for results queue to drain
25039 1726867461.73538: waiting for pending results...
25039 1726867461.73755: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-veth0
25039 1726867461.73823: in run() - task 0affcac9-a3a5-3ddc-7272-0000000004b6
25039 1726867461.73832: variable 'ansible_search_path' from source: unknown
25039 1726867461.73836: variable 'ansible_search_path' from source: unknown
25039 1726867461.73863: calling self._execute()
25039 1726867461.73940: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867461.73944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867461.73952: variable 'omit' from source: magic vars
25039 1726867461.74268: variable 'ansible_distribution_major_version' from source: facts
25039 1726867461.74279: Evaluated conditional (ansible_distribution_major_version != '6'): True
25039 1726867461.74365: variable 'profile_stat' from source: set_fact
25039 1726867461.74379: Evaluated conditional (profile_stat.stat.exists): False
25039 1726867461.74382: when evaluation is False, skipping this task
25039 1726867461.74386: _execute() done
25039 1726867461.74388: dumping result to json
25039 1726867461.74391: done dumping result, returning
25039 1726867461.74397: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-veth0 [0affcac9-a3a5-3ddc-7272-0000000004b6]
25039 1726867461.74402: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b6
25039 1726867461.74484: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b6
25039 1726867461.74487: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
25039 1726867461.74533: no more pending results, returning what we have
25039 1726867461.74537: results queue empty
25039 1726867461.74538: checking for any_errors_fatal
25039 1726867461.74544: done checking for any_errors_fatal
25039 1726867461.74544: checking for max_fail_percentage
25039 1726867461.74546: done checking for max_fail_percentage
25039 1726867461.74546: checking to see if all hosts have failed and the running result is not ok
25039 1726867461.74547: done checking to see if all hosts have failed
25039 1726867461.74548: getting the remaining hosts for this loop
25039 1726867461.74549: done getting the remaining hosts for this loop
25039 1726867461.74553: getting the next task for host managed_node1
25039 1726867461.74560: done getting next task for host managed_node1
25039 1726867461.74562: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
25039 1726867461.74567: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867461.74571: getting variables
25039 1726867461.74572: in VariableManager get_vars()
25039 1726867461.74612: Calling all_inventory to load vars for managed_node1
25039 1726867461.74615: Calling groups_inventory to load vars for managed_node1
25039 1726867461.74617: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867461.74626: Calling all_plugins_play to load vars for managed_node1
25039 1726867461.74629: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867461.74631: Calling groups_plugins_play to load vars for managed_node1
25039 1726867461.75533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867461.76485: done with get_vars()
25039 1726867461.76499: done getting variables
25039 1726867461.76555: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
25039 1726867461.76637: variable 'profile' from source: include params
25039 1726867461.76639: variable 'interface' from source: play vars
25039 1726867461.76683: variable 'interface' from source: play vars

TASK [Verify the ansible_managed comment in ifcfg-veth0] ***********************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Friday 20 September 2024 17:24:21 -0400 (0:00:00.034) 0:00:19.292 ******
25039 1726867461.76712: entering _queue_task() for managed_node1/set_fact
25039 1726867461.76944: worker is 1 (out of 1 available)
25039 1726867461.76954: exiting _queue_task() for managed_node1/set_fact
25039 1726867461.76968: done queuing things up, now waiting for results queue to drain
25039 1726867461.76970: waiting for pending results...
25039 1726867461.77202: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-veth0
25039 1726867461.77287: in run() - task 0affcac9-a3a5-3ddc-7272-0000000004b7
25039 1726867461.77298: variable 'ansible_search_path' from source: unknown
25039 1726867461.77301: variable 'ansible_search_path' from source: unknown
25039 1726867461.77329: calling self._execute()
25039 1726867461.77412: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867461.77417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867461.77425: variable 'omit' from source: magic vars
25039 1726867461.77689: variable 'ansible_distribution_major_version' from source: facts
25039 1726867461.77697: Evaluated conditional (ansible_distribution_major_version != '6'): True
25039 1726867461.77780: variable 'profile_stat' from source: set_fact
25039 1726867461.77793: Evaluated conditional (profile_stat.stat.exists): False
25039 1726867461.77797: when evaluation is False, skipping this task
25039 1726867461.77805: _execute() done
25039 1726867461.77808: dumping result to json
25039 1726867461.77810: done dumping result, returning
25039 1726867461.77813: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-veth0 [0affcac9-a3a5-3ddc-7272-0000000004b7]
25039 1726867461.77819: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b7
25039 1726867461.77912: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b7
25039 1726867461.77915: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
25039 1726867461.77981: no more pending results, returning what we have
25039 1726867461.77985: results queue empty
25039 1726867461.77986: checking for any_errors_fatal
25039 1726867461.77990: done checking for any_errors_fatal
25039 1726867461.77991: checking for max_fail_percentage
25039 1726867461.77992: done checking for max_fail_percentage
25039 1726867461.77993: checking to see if all hosts have failed and the running result is not ok
25039 1726867461.77994: done checking to see if all hosts have failed
25039 1726867461.77994: getting the remaining hosts for this loop
25039 1726867461.77995: done getting the remaining hosts for this loop
25039 1726867461.77999: getting the next task for host managed_node1
25039 1726867461.78004: done getting next task for host managed_node1
25039 1726867461.78006: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
25039 1726867461.78010: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867461.78013: getting variables
25039 1726867461.78015: in VariableManager get_vars()
25039 1726867461.78056: Calling all_inventory to load vars for managed_node1
25039 1726867461.78058: Calling groups_inventory to load vars for managed_node1
25039 1726867461.78060: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867461.78069: Calling all_plugins_play to load vars for managed_node1
25039 1726867461.78072: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867461.78074: Calling groups_plugins_play to load vars for managed_node1
25039 1726867461.79036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867461.79953: done with get_vars()
25039 1726867461.79974: done getting variables
25039 1726867461.80022: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
25039 1726867461.80118: variable 'profile' from source: include params
25039 1726867461.80122: variable 'interface' from source: play vars
25039 1726867461.80179: variable 'interface' from source: play vars

TASK [Get the fingerprint comment in ifcfg-veth0] ******************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 17:24:21 -0400 (0:00:00.034) 0:00:19.327 ******
25039 1726867461.80210: entering _queue_task() for managed_node1/command
25039 1726867461.80447: worker is 1 (out of 1 available)
25039 1726867461.80462: exiting _queue_task() for managed_node1/command
25039 1726867461.80474: done queuing things up, now waiting for results queue to drain
25039 1726867461.80476: waiting for pending results...
25039 1726867461.80684: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-veth0
25039 1726867461.80783: in run() - task 0affcac9-a3a5-3ddc-7272-0000000004b8
25039 1726867461.80806: variable 'ansible_search_path' from source: unknown
25039 1726867461.80812: variable 'ansible_search_path' from source: unknown
25039 1726867461.80849: calling self._execute()
25039 1726867461.80931: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867461.80935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867461.80952: variable 'omit' from source: magic vars
25039 1726867461.81250: variable 'ansible_distribution_major_version' from source: facts
25039 1726867461.81260: Evaluated conditional (ansible_distribution_major_version != '6'): True
25039 1726867461.81365: variable 'profile_stat' from source: set_fact
25039 1726867461.81382: Evaluated conditional (profile_stat.stat.exists): False
25039 1726867461.81387: when evaluation is False, skipping this task
25039 1726867461.81390: _execute() done
25039 1726867461.81393: dumping result to json
25039 1726867461.81397: done dumping result, returning
25039 1726867461.81400: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-veth0 [0affcac9-a3a5-3ddc-7272-0000000004b8]
25039 1726867461.81402: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b8
25039 1726867461.81482: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b8
25039 1726867461.81485: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
25039 1726867461.81555: no more pending results, returning what we have
25039 1726867461.81558: results queue empty
25039 1726867461.81559: checking for any_errors_fatal
25039 1726867461.81562: done checking for any_errors_fatal
25039 1726867461.81563: checking for max_fail_percentage
25039 1726867461.81564: done checking for max_fail_percentage
25039 1726867461.81565: checking to see if all hosts have failed and the running result is not ok
25039 1726867461.81566: done checking to see if all hosts have failed
25039 1726867461.81567: getting the remaining hosts for this loop
25039 1726867461.81568: done getting the remaining hosts for this loop
25039 1726867461.81571: getting the next task for host managed_node1
25039 1726867461.81576: done getting next task for host managed_node1
25039 1726867461.81580: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
25039 1726867461.81584: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867461.81587: getting variables
25039 1726867461.81588: in VariableManager get_vars()
25039 1726867461.81626: Calling all_inventory to load vars for managed_node1
25039 1726867461.81628: Calling groups_inventory to load vars for managed_node1
25039 1726867461.81630: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867461.81637: Calling all_plugins_play to load vars for managed_node1
25039 1726867461.81639: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867461.81640: Calling groups_plugins_play to load vars for managed_node1
25039 1726867461.82617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867461.83608: done with get_vars()
25039 1726867461.83622: done getting variables
25039 1726867461.83660: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
25039 1726867461.83730: variable 'profile' from source: include params
25039 1726867461.83733: variable 'interface' from source: play vars
25039 1726867461.83768: variable 'interface' from source: play vars

TASK [Verify the fingerprint comment in ifcfg-veth0] ***************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 17:24:21 -0400 (0:00:00.035) 0:00:19.363 ******
25039 1726867461.83791: entering _queue_task() for managed_node1/set_fact
25039 1726867461.83981: worker is 1 (out of 1 available)
25039 1726867461.83992: exiting _queue_task() for managed_node1/set_fact
25039 1726867461.84004: done queuing things up, now waiting for results queue to drain
25039 1726867461.84006: waiting for pending results...
25039 1726867461.84167: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-veth0
25039 1726867461.84239: in run() - task 0affcac9-a3a5-3ddc-7272-0000000004b9
25039 1726867461.84249: variable 'ansible_search_path' from source: unknown
25039 1726867461.84254: variable 'ansible_search_path' from source: unknown
25039 1726867461.84281: calling self._execute()
25039 1726867461.84346: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867461.84350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867461.84359: variable 'omit' from source: magic vars
25039 1726867461.84603: variable 'ansible_distribution_major_version' from source: facts
25039 1726867461.84614: Evaluated conditional (ansible_distribution_major_version != '6'): True
25039 1726867461.84695: variable 'profile_stat' from source: set_fact
25039 1726867461.84706: Evaluated conditional (profile_stat.stat.exists): False
25039 1726867461.84709: when evaluation is False, skipping this task
25039 1726867461.84715: _execute() done
25039 1726867461.84717: dumping result to json
25039 1726867461.84720: done dumping result, returning
25039 1726867461.84728: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-veth0 [0affcac9-a3a5-3ddc-7272-0000000004b9]
25039 1726867461.84732: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b9
25039 1726867461.84810: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000004b9
25039 1726867461.84813: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
25039 1726867461.84858: no more pending results, returning what we have
25039 1726867461.84862: results queue empty
25039 1726867461.84863: checking for any_errors_fatal
25039 1726867461.84868: done checking for any_errors_fatal
25039 1726867461.84868: checking for max_fail_percentage
25039 1726867461.84870: done checking for max_fail_percentage
25039 1726867461.84870: checking to see if all hosts have failed and the running result is not ok
25039 1726867461.84871: done checking to see if all hosts have failed
25039 1726867461.84872: getting the remaining hosts for this loop
25039 1726867461.84874: done getting the remaining hosts for this loop
25039 1726867461.84878: getting the next task for host managed_node1
25039 1726867461.84885: done getting next task for host managed_node1
25039 1726867461.84887: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
25039 1726867461.84890: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867461.84894: getting variables
25039 1726867461.84895: in VariableManager get_vars()
25039 1726867461.84926: Calling all_inventory to load vars for managed_node1
25039 1726867461.84928: Calling groups_inventory to load vars for managed_node1
25039 1726867461.84930: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867461.84939: Calling all_plugins_play to load vars for managed_node1
25039 1726867461.84941: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867461.84944: Calling groups_plugins_play to load vars for managed_node1
25039 1726867461.85716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867461.86793: done with get_vars()
25039 1726867461.86824: done getting variables
25039 1726867461.86906: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
25039 1726867461.87033: variable 'profile' from source: include params
25039 1726867461.87036: variable 'interface' from source: play vars
25039 1726867461.87106: variable 'interface' from source: play vars

TASK [Assert that the profile is present - 'veth0'] ****************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Friday 20 September 2024 17:24:21 -0400 (0:00:00.033) 0:00:19.397 ******
25039 1726867461.87146: entering _queue_task() for managed_node1/assert
25039 1726867461.87462: worker is 1 (out of 1 available)
25039 1726867461.87474: exiting _queue_task() for managed_node1/assert
25039 1726867461.87493: done queuing things up, now waiting for results queue to drain
25039 1726867461.87494: waiting for pending results...
25039 1726867461.87907: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'veth0'
25039 1726867461.87983: in run() - task 0affcac9-a3a5-3ddc-7272-0000000003b9
25039 1726867461.87988: variable 'ansible_search_path' from source: unknown
25039 1726867461.87993: variable 'ansible_search_path' from source: unknown
25039 1726867461.88039: calling self._execute()
25039 1726867461.88156: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867461.88160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867461.88162: variable 'omit' from source: magic vars
25039 1726867461.88503: variable 'ansible_distribution_major_version' from source: facts
25039 1726867461.88544: Evaluated conditional (ansible_distribution_major_version != '6'): True
25039 1726867461.88550: variable 'omit' from source: magic vars
25039 1726867461.88614: variable 'omit' from source: magic vars
25039 1726867461.88790: variable 'profile' from source: include params
25039 1726867461.88795: variable 'interface' from source: play vars
25039 1726867461.88813: variable 'interface' from source: play vars
25039 1726867461.88829: variable 'omit' from source: magic vars
25039 1726867461.88864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
25039 1726867461.88895: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
25039 1726867461.88925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
25039 1726867461.88955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
25039 1726867461.88981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
25039 1726867461.89032: variable 'inventory_hostname' from source: host vars for 'managed_node1'
25039 1726867461.89038: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867461.89042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867461.89169: Set connection var ansible_shell_executable to /bin/sh
25039 1726867461.89173: Set connection var ansible_timeout to 10
25039 1726867461.89175: Set connection var ansible_shell_type to sh
25039 1726867461.89182: Set connection var ansible_connection to ssh
25039 1726867461.89190: Set connection var ansible_module_compression to ZIP_DEFLATED
25039 1726867461.89193: Set connection var ansible_pipelining to False
25039 1726867461.89234: variable 'ansible_shell_executable' from source: unknown
25039 1726867461.89238: variable 'ansible_connection' from source: unknown
25039 1726867461.89241: variable 'ansible_module_compression' from source: unknown
25039 1726867461.89243: variable 'ansible_shell_type' from source: unknown
25039 1726867461.89248: variable 'ansible_shell_executable' from source: unknown
25039 1726867461.89250: variable 'ansible_host' from source: host vars for 'managed_node1'
25039 1726867461.89255: variable 'ansible_pipelining' from source: unknown
25039 1726867461.89258: variable 'ansible_timeout' from source: unknown
25039 1726867461.89260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
25039 1726867461.89363: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
25039 1726867461.89379: variable 'omit' from source: magic vars
25039 1726867461.89383: starting attempt loop
25039 1726867461.89385: running the handler
25039 1726867461.89454: variable 'lsr_net_profile_exists' from source: set_fact
25039 1726867461.89457: Evaluated conditional (lsr_net_profile_exists): True
25039 1726867461.89464: handler run complete
25039 1726867461.89475: attempt loop complete, returning result
25039 1726867461.89480: _execute() done
25039 1726867461.89483: dumping result to json
25039 1726867461.89486: done dumping result, returning
25039 1726867461.89496: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'veth0' [0affcac9-a3a5-3ddc-7272-0000000003b9]
25039 1726867461.89499: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000003b9
25039 1726867461.89571: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000003b9
25039 1726867461.89574: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed
25039 1726867461.89653: no more pending results, returning what we have
25039 1726867461.89656: results queue empty
25039 1726867461.89657: checking for any_errors_fatal
25039 1726867461.89662: done checking for any_errors_fatal
25039 1726867461.89662: checking for max_fail_percentage
25039 1726867461.89664: done checking for max_fail_percentage
25039 1726867461.89664: checking to see if all hosts have failed and the running result is not ok
25039 1726867461.89665: done checking to see if all hosts have failed
25039 1726867461.89666: getting the remaining hosts for this loop
25039 1726867461.89667: done getting the remaining hosts for this loop
25039 1726867461.89670: getting the next task for host managed_node1
25039 1726867461.89675: done getting next task for host managed_node1
25039 1726867461.89679: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
25039 1726867461.89681: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867461.89686: getting variables 25039 1726867461.89687: in VariableManager get_vars() 25039 1726867461.89724: Calling all_inventory to load vars for managed_node1 25039 1726867461.89727: Calling groups_inventory to load vars for managed_node1 25039 1726867461.89729: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867461.89738: Calling all_plugins_play to load vars for managed_node1 25039 1726867461.89740: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867461.89743: Calling groups_plugins_play to load vars for managed_node1 25039 1726867461.91228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867461.92470: done with get_vars() 25039 1726867461.92489: done getting variables 25039 1726867461.92534: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867461.92618: variable 'profile' from source: include params 25039 1726867461.92622: variable 'interface' from source: play vars 25039 1726867461.92660: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:24:21 -0400 
(0:00:00.055) 0:00:19.452 ****** 25039 1726867461.92688: entering _queue_task() for managed_node1/assert 25039 1726867461.92933: worker is 1 (out of 1 available) 25039 1726867461.92945: exiting _queue_task() for managed_node1/assert 25039 1726867461.92957: done queuing things up, now waiting for results queue to drain 25039 1726867461.92959: waiting for pending results... 25039 1726867461.93131: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'veth0' 25039 1726867461.93199: in run() - task 0affcac9-a3a5-3ddc-7272-0000000003ba 25039 1726867461.93213: variable 'ansible_search_path' from source: unknown 25039 1726867461.93216: variable 'ansible_search_path' from source: unknown 25039 1726867461.93242: calling self._execute() 25039 1726867461.93321: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867461.93324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867461.93334: variable 'omit' from source: magic vars 25039 1726867461.93580: variable 'ansible_distribution_major_version' from source: facts 25039 1726867461.93589: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867461.93595: variable 'omit' from source: magic vars 25039 1726867461.93629: variable 'omit' from source: magic vars 25039 1726867461.93696: variable 'profile' from source: include params 25039 1726867461.93700: variable 'interface' from source: play vars 25039 1726867461.93748: variable 'interface' from source: play vars 25039 1726867461.93762: variable 'omit' from source: magic vars 25039 1726867461.93797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867461.93824: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867461.93844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 
1726867461.93857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867461.93867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867461.93894: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867461.93897: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867461.93900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867461.93969: Set connection var ansible_shell_executable to /bin/sh 25039 1726867461.93974: Set connection var ansible_timeout to 10 25039 1726867461.93981: Set connection var ansible_shell_type to sh 25039 1726867461.93983: Set connection var ansible_connection to ssh 25039 1726867461.93990: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867461.93995: Set connection var ansible_pipelining to False 25039 1726867461.94013: variable 'ansible_shell_executable' from source: unknown 25039 1726867461.94016: variable 'ansible_connection' from source: unknown 25039 1726867461.94019: variable 'ansible_module_compression' from source: unknown 25039 1726867461.94021: variable 'ansible_shell_type' from source: unknown 25039 1726867461.94023: variable 'ansible_shell_executable' from source: unknown 25039 1726867461.94025: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867461.94030: variable 'ansible_pipelining' from source: unknown 25039 1726867461.94032: variable 'ansible_timeout' from source: unknown 25039 1726867461.94036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867461.94135: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867461.94144: variable 'omit' from source: magic vars 25039 1726867461.94148: starting attempt loop 25039 1726867461.94151: running the handler 25039 1726867461.94229: variable 'lsr_net_profile_ansible_managed' from source: set_fact 25039 1726867461.94232: Evaluated conditional (lsr_net_profile_ansible_managed): True 25039 1726867461.94239: handler run complete 25039 1726867461.94251: attempt loop complete, returning result 25039 1726867461.94253: _execute() done 25039 1726867461.94256: dumping result to json 25039 1726867461.94258: done dumping result, returning 25039 1726867461.94266: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'veth0' [0affcac9-a3a5-3ddc-7272-0000000003ba] 25039 1726867461.94269: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000003ba 25039 1726867461.94348: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000003ba 25039 1726867461.94351: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 25039 1726867461.94428: no more pending results, returning what we have 25039 1726867461.94431: results queue empty 25039 1726867461.94432: checking for any_errors_fatal 25039 1726867461.94437: done checking for any_errors_fatal 25039 1726867461.94437: checking for max_fail_percentage 25039 1726867461.94439: done checking for max_fail_percentage 25039 1726867461.94440: checking to see if all hosts have failed and the running result is not ok 25039 1726867461.94441: done checking to see if all hosts have failed 25039 1726867461.94442: getting the remaining hosts for this loop 25039 1726867461.94443: done getting the remaining hosts for this loop 25039 1726867461.94446: getting the next task for host managed_node1 25039 1726867461.94452: done getting 
next task for host managed_node1 25039 1726867461.94455: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 25039 1726867461.94457: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867461.94461: getting variables 25039 1726867461.94462: in VariableManager get_vars() 25039 1726867461.94495: Calling all_inventory to load vars for managed_node1 25039 1726867461.94497: Calling groups_inventory to load vars for managed_node1 25039 1726867461.94500: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867461.94511: Calling all_plugins_play to load vars for managed_node1 25039 1726867461.94513: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867461.94516: Calling groups_plugins_play to load vars for managed_node1 25039 1726867461.95841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867461.97639: done with get_vars() 25039 1726867461.97661: done getting variables 25039 1726867461.97723: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867461.97845: variable 'profile' from source: include params 25039 1726867461.97849: variable 'interface' from 
source: play vars 25039 1726867461.97912: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:24:21 -0400 (0:00:00.052) 0:00:19.505 ****** 25039 1726867461.97958: entering _queue_task() for managed_node1/assert 25039 1726867461.98492: worker is 1 (out of 1 available) 25039 1726867461.98501: exiting _queue_task() for managed_node1/assert 25039 1726867461.98514: done queuing things up, now waiting for results queue to drain 25039 1726867461.98516: waiting for pending results... 25039 1726867461.98647: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in veth0 25039 1726867461.98735: in run() - task 0affcac9-a3a5-3ddc-7272-0000000003bb 25039 1726867461.98854: variable 'ansible_search_path' from source: unknown 25039 1726867461.98857: variable 'ansible_search_path' from source: unknown 25039 1726867461.98860: calling self._execute() 25039 1726867461.98916: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867461.98930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867461.98947: variable 'omit' from source: magic vars 25039 1726867461.99346: variable 'ansible_distribution_major_version' from source: facts 25039 1726867461.99366: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867461.99380: variable 'omit' from source: magic vars 25039 1726867461.99436: variable 'omit' from source: magic vars 25039 1726867461.99547: variable 'profile' from source: include params 25039 1726867461.99558: variable 'interface' from source: play vars 25039 1726867461.99636: variable 'interface' from source: play vars 25039 1726867461.99659: variable 'omit' from source: magic vars 25039 1726867461.99704: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867461.99756: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867461.99786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867461.99830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867461.99833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867461.99941: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867461.99945: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867461.99947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.00000: Set connection var ansible_shell_executable to /bin/sh 25039 1726867462.00017: Set connection var ansible_timeout to 10 25039 1726867462.00030: Set connection var ansible_shell_type to sh 25039 1726867462.00033: Set connection var ansible_connection to ssh 25039 1726867462.00041: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867462.00059: Set connection var ansible_pipelining to False 25039 1726867462.00098: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.00101: variable 'ansible_connection' from source: unknown 25039 1726867462.00104: variable 'ansible_module_compression' from source: unknown 25039 1726867462.00106: variable 'ansible_shell_type' from source: unknown 25039 1726867462.00108: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.00110: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.00112: variable 'ansible_pipelining' from source: unknown 25039 1726867462.00115: variable 'ansible_timeout' from 
source: unknown 25039 1726867462.00117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.00221: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867462.00229: variable 'omit' from source: magic vars 25039 1726867462.00234: starting attempt loop 25039 1726867462.00237: running the handler 25039 1726867462.00315: variable 'lsr_net_profile_fingerprint' from source: set_fact 25039 1726867462.00320: Evaluated conditional (lsr_net_profile_fingerprint): True 25039 1726867462.00326: handler run complete 25039 1726867462.00337: attempt loop complete, returning result 25039 1726867462.00339: _execute() done 25039 1726867462.00342: dumping result to json 25039 1726867462.00344: done dumping result, returning 25039 1726867462.00350: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in veth0 [0affcac9-a3a5-3ddc-7272-0000000003bb] 25039 1726867462.00354: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000003bb 25039 1726867462.00435: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000003bb 25039 1726867462.00438: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 25039 1726867462.00523: no more pending results, returning what we have 25039 1726867462.00526: results queue empty 25039 1726867462.00526: checking for any_errors_fatal 25039 1726867462.00530: done checking for any_errors_fatal 25039 1726867462.00531: checking for max_fail_percentage 25039 1726867462.00532: done checking for max_fail_percentage 25039 1726867462.00533: checking to see if all hosts have failed and the running result is not ok 25039 1726867462.00534: done checking to see if all hosts have 
failed 25039 1726867462.00535: getting the remaining hosts for this loop 25039 1726867462.00536: done getting the remaining hosts for this loop 25039 1726867462.00539: getting the next task for host managed_node1 25039 1726867462.00552: done getting next task for host managed_node1 25039 1726867462.00555: ^ task is: TASK: Get ip address information 25039 1726867462.00557: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867462.00561: getting variables 25039 1726867462.00562: in VariableManager get_vars() 25039 1726867462.00596: Calling all_inventory to load vars for managed_node1 25039 1726867462.00599: Calling groups_inventory to load vars for managed_node1 25039 1726867462.00601: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867462.00610: Calling all_plugins_play to load vars for managed_node1 25039 1726867462.00612: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867462.00615: Calling groups_plugins_play to load vars for managed_node1 25039 1726867462.01370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867462.02251: done with get_vars() 25039 1726867462.02266: done getting variables 25039 1726867462.02310: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get ip address information] ********************************************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:53 Friday 20 September 2024 17:24:22 -0400 (0:00:00.043) 0:00:19.548 ****** 25039 1726867462.02329: entering _queue_task() for managed_node1/command 25039 1726867462.02547: worker is 1 (out of 1 available) 25039 1726867462.02560: exiting _queue_task() for managed_node1/command 25039 1726867462.02573: done queuing things up, now waiting for results queue to drain 25039 1726867462.02578: waiting for pending results... 25039 1726867462.02873: running TaskExecutor() for managed_node1/TASK: Get ip address information 25039 1726867462.02881: in run() - task 0affcac9-a3a5-3ddc-7272-00000000005e 25039 1726867462.02886: variable 'ansible_search_path' from source: unknown 25039 1726867462.02946: calling self._execute() 25039 1726867462.03065: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.03069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.03071: variable 'omit' from source: magic vars 25039 1726867462.03444: variable 'ansible_distribution_major_version' from source: facts 25039 1726867462.03461: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867462.03473: variable 'omit' from source: magic vars 25039 1726867462.03618: variable 'omit' from source: magic vars 25039 1726867462.03624: variable 'interface' from source: play vars 25039 1726867462.03643: variable 'omit' from source: magic vars 25039 1726867462.03692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867462.03723: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867462.03742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867462.03751: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867462.03761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867462.03787: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867462.03790: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.03792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.03864: Set connection var ansible_shell_executable to /bin/sh 25039 1726867462.03869: Set connection var ansible_timeout to 10 25039 1726867462.03874: Set connection var ansible_shell_type to sh 25039 1726867462.03878: Set connection var ansible_connection to ssh 25039 1726867462.03885: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867462.03891: Set connection var ansible_pipelining to False 25039 1726867462.03915: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.03919: variable 'ansible_connection' from source: unknown 25039 1726867462.03921: variable 'ansible_module_compression' from source: unknown 25039 1726867462.03924: variable 'ansible_shell_type' from source: unknown 25039 1726867462.03926: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.03929: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.03931: variable 'ansible_pipelining' from source: unknown 25039 1726867462.03933: variable 'ansible_timeout' from source: unknown 25039 1726867462.03935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.04034: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867462.04042: variable 'omit' from source: magic vars 25039 1726867462.04046: starting attempt loop 25039 1726867462.04049: running the handler 25039 1726867462.04068: _low_level_execute_command(): starting 25039 1726867462.04071: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867462.04559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867462.04564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867462.04567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.04624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867462.04630: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867462.04682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867462.06419: stdout chunk (state=3): 
>>>/root <<< 25039 1726867462.06521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867462.06546: stderr chunk (state=3): >>><<< 25039 1726867462.06549: stdout chunk (state=3): >>><<< 25039 1726867462.06570: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867462.06583: _low_level_execute_command(): starting 25039 1726867462.06593: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092 `" && echo ansible-tmp-1726867462.065694-25919-95846067255092="` echo /root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092 `" ) && sleep 0' 25039 1726867462.07220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 25039 1726867462.07223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867462.07226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867462.07229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867462.07231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867462.07233: stderr chunk (state=3): >>>debug2: match not found <<< 25039 1726867462.07243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.07246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25039 1726867462.07328: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867462.07345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867462.07358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867462.07436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867462.09355: stdout chunk (state=3): >>>ansible-tmp-1726867462.065694-25919-95846067255092=/root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092 <<< 25039 1726867462.09503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867462.09525: 
stderr chunk (state=3): >>><<< 25039 1726867462.09535: stdout chunk (state=3): >>><<< 25039 1726867462.09559: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867462.065694-25919-95846067255092=/root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867462.09599: variable 'ansible_module_compression' from source: unknown 25039 1726867462.09734: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867462.09737: variable 'ansible_facts' from source: unknown 25039 1726867462.09803: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092/AnsiballZ_command.py 25039 1726867462.09971: Sending initial data 25039 1726867462.09983: Sent initial data (154 
bytes) 25039 1726867462.10585: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867462.10688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867462.10745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867462.10789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867462.12327: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867462.12388: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867462.12444: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpx2uhsm24 /root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092/AnsiballZ_command.py <<< 25039 1726867462.12447: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092/AnsiballZ_command.py" <<< 25039 1726867462.12490: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpx2uhsm24" to remote "/root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092/AnsiballZ_command.py" <<< 25039 1726867462.13343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867462.13347: stdout chunk (state=3): >>><<< 25039 1726867462.13349: stderr chunk (state=3): >>><<< 25039 1726867462.13382: done transferring module to remote 25039 1726867462.13398: _low_level_execute_command(): starting 25039 1726867462.13414: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092/ /root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092/AnsiballZ_command.py && sleep 0' 25039 1726867462.14082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867462.14105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.14175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867462.14210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867462.14254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867462.14305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867462.16137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867462.16141: stdout chunk (state=3): >>><<< 25039 1726867462.16144: stderr chunk (state=3): >>><<< 25039 1726867462.16163: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867462.16171: _low_level_execute_command(): starting 25039 1726867462.16184: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092/AnsiballZ_command.py && sleep 0' 25039 1726867462.16865: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867462.16869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.16872: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867462.16874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.16917: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867462.16933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867462.16960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867462.17056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867462.32402: stdout chunk (state=3): >>> {"changed": true, "stdout": "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether c6:32:56:ba:34:22 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::c432:56ff:feba:3422/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-20 17:24:22.318259", "end": "2024-09-20 17:24:22.322073", "delta": "0:00:00.003814", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867462.33897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867462.33926: stderr chunk (state=3): >>><<< 25039 1726867462.33930: stdout chunk (state=3): >>><<< 25039 1726867462.33953: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether c6:32:56:ba:34:22 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::c432:56ff:feba:3422/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-20 17:24:22.318259", "end": "2024-09-20 17:24:22.322073", "delta": "0:00:00.003814", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 25039 1726867462.33995: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip addr show veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867462.34002: _low_level_execute_command(): starting 25039 1726867462.34025: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867462.065694-25919-95846067255092/ > /dev/null 2>&1 && sleep 0' 25039 1726867462.34576: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867462.34581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.34584: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 25039 1726867462.34587: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867462.34589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.34640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867462.34646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867462.34651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867462.34719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867462.36514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867462.36518: stdout chunk (state=3): >>><<< 25039 1726867462.36526: stderr chunk (state=3): >>><<< 25039 1726867462.36538: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867462.36544: handler run complete 25039 1726867462.36565: Evaluated conditional (False): False 25039 1726867462.36573: attempt loop complete, returning result 25039 1726867462.36576: _execute() done 25039 1726867462.36580: dumping result to json 25039 1726867462.36585: done dumping result, returning 25039 1726867462.36593: done running TaskExecutor() for managed_node1/TASK: Get ip address information [0affcac9-a3a5-3ddc-7272-00000000005e] 25039 1726867462.36597: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000005e 25039 1726867462.36691: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000005e 25039 1726867462.36694: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "addr", "show", "veth0" ], "delta": "0:00:00.003814", "end": "2024-09-20 17:24:22.322073", "rc": 0, "start": "2024-09-20 17:24:22.318259" } STDOUT: 31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000 link/ether c6:32:56:ba:34:22 brd ff:ff:ff:ff:ff:ff link-netns ns1 inet6 2001:db8::2/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::3/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::4/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 fe80::c432:56ff:feba:3422/64 scope link noprefixroute valid_lft forever preferred_lft forever 25039 1726867462.36772: no more pending results, returning what we 
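
The "Get ip address information" result above is consistent with a `command`-module task along the following lines. This is a hypothetical reconstruction from the log (the `module_args` show `_raw_params: ip addr show veth0` with `_uses_shell: false`, and the final result reports `"changed": false`); the real task lives in `tests_ipv6.yml` and may differ in detail, and the registered variable name is an assumption based on the later `ip_addr` references:

```yaml
# Sketch only: reconstructed from the log's module_args, not the actual
# task source. The register name "ip_addr" is assumed.
- name: Get ip address information
  command: ip addr show veth0     # matches _raw_params in the log
  register: ip_addr
  changed_when: false             # implied by "changed": false in the result
```
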
have 25039 1726867462.36776: results queue empty 25039 1726867462.36783: checking for any_errors_fatal 25039 1726867462.36789: done checking for any_errors_fatal 25039 1726867462.36790: checking for max_fail_percentage 25039 1726867462.36792: done checking for max_fail_percentage 25039 1726867462.36793: checking to see if all hosts have failed and the running result is not ok 25039 1726867462.36794: done checking to see if all hosts have failed 25039 1726867462.36794: getting the remaining hosts for this loop 25039 1726867462.36796: done getting the remaining hosts for this loop 25039 1726867462.36799: getting the next task for host managed_node1 25039 1726867462.36813: done getting next task for host managed_node1 25039 1726867462.36816: ^ task is: TASK: Show ip_addr 25039 1726867462.36818: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867462.36822: getting variables 25039 1726867462.36823: in VariableManager get_vars() 25039 1726867462.36862: Calling all_inventory to load vars for managed_node1 25039 1726867462.36865: Calling groups_inventory to load vars for managed_node1 25039 1726867462.36867: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867462.36878: Calling all_plugins_play to load vars for managed_node1 25039 1726867462.36881: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867462.36884: Calling groups_plugins_play to load vars for managed_node1 25039 1726867462.41332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867462.42652: done with get_vars() 25039 1726867462.42670: done getting variables 25039 1726867462.42706: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ip_addr] ************************************************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:57 Friday 20 September 2024 17:24:22 -0400 (0:00:00.403) 0:00:19.952 ****** 25039 1726867462.42726: entering _queue_task() for managed_node1/debug 25039 1726867462.43102: worker is 1 (out of 1 available) 25039 1726867462.43117: exiting _queue_task() for managed_node1/debug 25039 1726867462.43130: done queuing things up, now waiting for results queue to drain 25039 1726867462.43132: waiting for pending results... 
25039 1726867462.43322: running TaskExecutor() for managed_node1/TASK: Show ip_addr 25039 1726867462.43422: in run() - task 0affcac9-a3a5-3ddc-7272-00000000005f 25039 1726867462.43429: variable 'ansible_search_path' from source: unknown 25039 1726867462.43479: calling self._execute() 25039 1726867462.43566: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.43571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.43612: variable 'omit' from source: magic vars 25039 1726867462.43957: variable 'ansible_distribution_major_version' from source: facts 25039 1726867462.43967: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867462.43973: variable 'omit' from source: magic vars 25039 1726867462.43994: variable 'omit' from source: magic vars 25039 1726867462.44028: variable 'omit' from source: magic vars 25039 1726867462.44103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867462.44135: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867462.44154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867462.44189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867462.44207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867462.44238: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867462.44241: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.44248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.44352: Set connection var ansible_shell_executable to /bin/sh 25039 1726867462.44357: Set 
connection var ansible_timeout to 10 25039 1726867462.44363: Set connection var ansible_shell_type to sh 25039 1726867462.44366: Set connection var ansible_connection to ssh 25039 1726867462.44372: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867462.44379: Set connection var ansible_pipelining to False 25039 1726867462.44397: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.44405: variable 'ansible_connection' from source: unknown 25039 1726867462.44408: variable 'ansible_module_compression' from source: unknown 25039 1726867462.44411: variable 'ansible_shell_type' from source: unknown 25039 1726867462.44413: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.44415: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.44422: variable 'ansible_pipelining' from source: unknown 25039 1726867462.44424: variable 'ansible_timeout' from source: unknown 25039 1726867462.44426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.44532: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867462.44542: variable 'omit' from source: magic vars 25039 1726867462.44546: starting attempt loop 25039 1726867462.44549: running the handler 25039 1726867462.44709: variable 'ip_addr' from source: set_fact 25039 1726867462.44738: handler run complete 25039 1726867462.44750: attempt loop complete, returning result 25039 1726867462.44753: _execute() done 25039 1726867462.44760: dumping result to json 25039 1726867462.44763: done dumping result, returning 25039 1726867462.44766: done running TaskExecutor() for managed_node1/TASK: Show ip_addr [0affcac9-a3a5-3ddc-7272-00000000005f] 25039 
1726867462.44768: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000005f 25039 1726867462.44851: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000005f ok: [managed_node1] => { "ip_addr.stdout": "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether c6:32:56:ba:34:22 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::c432:56ff:feba:3422/64 scope link noprefixroute \n valid_lft forever preferred_lft forever" } 25039 1726867462.44906: no more pending results, returning what we have 25039 1726867462.44909: results queue empty 25039 1726867462.44910: checking for any_errors_fatal 25039 1726867462.44919: done checking for any_errors_fatal 25039 1726867462.44920: checking for max_fail_percentage 25039 1726867462.44921: done checking for max_fail_percentage 25039 1726867462.44922: checking to see if all hosts have failed and the running result is not ok 25039 1726867462.44923: done checking to see if all hosts have failed 25039 1726867462.44924: getting the remaining hosts for this loop 25039 1726867462.44925: done getting the remaining hosts for this loop 25039 1726867462.44928: getting the next task for host managed_node1 25039 1726867462.44934: done getting next task for host managed_node1 25039 1726867462.44936: ^ task is: TASK: Assert ipv6 addresses are correctly set 25039 1726867462.44938: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
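
The "Show ip_addr" output above, which prints the key `"ip_addr.stdout"`, would be produced by a `debug` task roughly like this (a sketch inferred from the log, not the verbatim task source):

```yaml
# Sketch: a debug task printing the registered command output.
# The var name matches the "ip_addr.stdout" key shown in the log result.
- name: Show ip_addr
  debug:
    var: ip_addr.stdout
```
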
False 25039 1726867462.44941: getting variables 25039 1726867462.44943: in VariableManager get_vars() 25039 1726867462.44982: Calling all_inventory to load vars for managed_node1 25039 1726867462.44985: Calling groups_inventory to load vars for managed_node1 25039 1726867462.44987: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867462.44997: Calling all_plugins_play to load vars for managed_node1 25039 1726867462.45000: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867462.45002: Calling groups_plugins_play to load vars for managed_node1 25039 1726867462.45591: WORKER PROCESS EXITING 25039 1726867462.45813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867462.46730: done with get_vars() 25039 1726867462.46745: done getting variables 25039 1726867462.46786: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert ipv6 addresses are correctly set] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:60 Friday 20 September 2024 17:24:22 -0400 (0:00:00.040) 0:00:19.993 ****** 25039 1726867462.46806: entering _queue_task() for managed_node1/assert 25039 1726867462.47096: worker is 1 (out of 1 available) 25039 1726867462.47114: exiting _queue_task() for managed_node1/assert 25039 1726867462.47127: done queuing things up, now waiting for results queue to drain 25039 1726867462.47128: waiting for pending results... 
25039 1726867462.47427: running TaskExecutor() for managed_node1/TASK: Assert ipv6 addresses are correctly set 25039 1726867462.47494: in run() - task 0affcac9-a3a5-3ddc-7272-000000000060 25039 1726867462.47514: variable 'ansible_search_path' from source: unknown 25039 1726867462.47541: calling self._execute() 25039 1726867462.47623: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.47629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.47637: variable 'omit' from source: magic vars 25039 1726867462.47962: variable 'ansible_distribution_major_version' from source: facts 25039 1726867462.47966: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867462.47972: variable 'omit' from source: magic vars 25039 1726867462.48013: variable 'omit' from source: magic vars 25039 1726867462.48029: variable 'omit' from source: magic vars 25039 1726867462.48063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867462.48093: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867462.48111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867462.48123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867462.48133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867462.48156: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867462.48159: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.48162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.48232: Set connection var ansible_shell_executable to /bin/sh 
25039 1726867462.48238: Set connection var ansible_timeout to 10 25039 1726867462.48242: Set connection var ansible_shell_type to sh 25039 1726867462.48245: Set connection var ansible_connection to ssh 25039 1726867462.48251: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867462.48256: Set connection var ansible_pipelining to False 25039 1726867462.48273: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.48275: variable 'ansible_connection' from source: unknown 25039 1726867462.48280: variable 'ansible_module_compression' from source: unknown 25039 1726867462.48283: variable 'ansible_shell_type' from source: unknown 25039 1726867462.48286: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.48289: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.48291: variable 'ansible_pipelining' from source: unknown 25039 1726867462.48294: variable 'ansible_timeout' from source: unknown 25039 1726867462.48296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.48393: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867462.48402: variable 'omit' from source: magic vars 25039 1726867462.48415: starting attempt loop 25039 1726867462.48418: running the handler 25039 1726867462.48510: variable 'ip_addr' from source: set_fact 25039 1726867462.48522: Evaluated conditional ('inet6 2001:db8::2/32' in ip_addr.stdout): True 25039 1726867462.48601: variable 'ip_addr' from source: set_fact 25039 1726867462.48611: Evaluated conditional ('inet6 2001:db8::3/32' in ip_addr.stdout): True 25039 1726867462.48691: variable 'ip_addr' from source: set_fact 25039 1726867462.48698: Evaluated 
conditional ('inet6 2001:db8::4/32' in ip_addr.stdout): True
25039 1726867462.48710: handler run complete
25039 1726867462.48751: attempt loop complete, returning result
25039 1726867462.48754: _execute() done
25039 1726867462.48756: dumping result to json
25039 1726867462.48759: done dumping result, returning
25039 1726867462.48761: done running TaskExecutor() for managed_node1/TASK: Assert ipv6 addresses are correctly set [0affcac9-a3a5-3ddc-7272-000000000060]
25039 1726867462.48763: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000060
25039 1726867462.48832: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000060
25039 1726867462.48834: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed
25039 1726867462.48896: no more pending results, returning what we have
25039 1726867462.48899: results queue empty
25039 1726867462.48900: checking for any_errors_fatal
25039 1726867462.48906: done checking for any_errors_fatal
25039 1726867462.48907: checking for max_fail_percentage
25039 1726867462.48911: done checking for max_fail_percentage
25039 1726867462.48912: checking to see if all hosts have failed and the running result is not ok
25039 1726867462.48913: done checking to see if all hosts have failed
25039 1726867462.48914: getting the remaining hosts for this loop
25039 1726867462.48915: done getting the remaining hosts for this loop
25039 1726867462.48918: getting the next task for host managed_node1
25039 1726867462.48923: done getting next task for host managed_node1
25039 1726867462.48926: ^ task is: TASK: Get ipv6 routes
25039 1726867462.48928: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
25039 1726867462.48931: getting variables
25039 1726867462.48933: in VariableManager get_vars()
25039 1726867462.48966: Calling all_inventory to load vars for managed_node1
25039 1726867462.48968: Calling groups_inventory to load vars for managed_node1
25039 1726867462.48970: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867462.48981: Calling all_plugins_play to load vars for managed_node1
25039 1726867462.48983: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867462.48986: Calling groups_plugins_play to load vars for managed_node1
25039 1726867462.50103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867462.51170: done with get_vars()
25039 1726867462.51193: done getting variables
25039 1726867462.51251: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Get ipv6 routes] *********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:69
Friday 20 September 2024 17:24:22 -0400 (0:00:00.044) 0:00:20.038 ******
25039 1726867462.51291: entering _queue_task() for managed_node1/command
25039 1726867462.51688: worker is 1 (out of 1 available)
25039 1726867462.51699: exiting _queue_task() for managed_node1/command
25039 1726867462.51712: done queuing things up, now waiting for results queue to drain
25039 1726867462.51714: waiting for pending results...
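The three conditionals the assert handler evaluated above ('inet6 2001:db8::2/32' … '::4/32' in ip_addr.stdout) come from an assert task earlier in tests_ipv6.yml. A hypothetical reconstruction of that task is sketched below; the exact task body is not shown in this log, and the registered variable name `ip_addr` is taken from the "from source: set_fact" entries, so everything else (quoting style, task ordering) is an assumption:

```yaml
# Hypothetical reconstruction from the logged conditionals; not the verbatim task.
- name: Assert ipv6 addresses are correctly set
  ansible.builtin.assert:
    that:
      - "'inet6 2001:db8::2/32' in ip_addr.stdout"
      - "'inet6 2001:db8::3/32' in ip_addr.stdout"
      - "'inet6 2001:db8::4/32' in ip_addr.stdout"
```

Each `that` entry is evaluated as a separate Jinja2 conditional, which is why the log shows three individual "Evaluated conditional (...): True" entries before "handler run complete".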
25039 1726867462.52108: running TaskExecutor() for managed_node1/TASK: Get ipv6 routes 25039 1726867462.52131: in run() - task 0affcac9-a3a5-3ddc-7272-000000000061 25039 1726867462.52139: variable 'ansible_search_path' from source: unknown 25039 1726867462.52172: calling self._execute() 25039 1726867462.52267: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.52272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.52286: variable 'omit' from source: magic vars 25039 1726867462.52772: variable 'ansible_distribution_major_version' from source: facts 25039 1726867462.52776: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867462.52782: variable 'omit' from source: magic vars 25039 1726867462.52876: variable 'omit' from source: magic vars 25039 1726867462.52882: variable 'omit' from source: magic vars 25039 1726867462.52971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867462.53003: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867462.53059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867462.53087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867462.53091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867462.53121: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867462.53125: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.53128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.53314: Set connection var ansible_shell_executable to /bin/sh 25039 1726867462.53325: Set 
connection var ansible_timeout to 10 25039 1726867462.53328: Set connection var ansible_shell_type to sh 25039 1726867462.53330: Set connection var ansible_connection to ssh 25039 1726867462.53385: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867462.53420: Set connection var ansible_pipelining to False 25039 1726867462.53467: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.53472: variable 'ansible_connection' from source: unknown 25039 1726867462.53475: variable 'ansible_module_compression' from source: unknown 25039 1726867462.53485: variable 'ansible_shell_type' from source: unknown 25039 1726867462.53488: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.53490: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.53493: variable 'ansible_pipelining' from source: unknown 25039 1726867462.53495: variable 'ansible_timeout' from source: unknown 25039 1726867462.53585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.53737: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867462.53746: variable 'omit' from source: magic vars 25039 1726867462.53750: starting attempt loop 25039 1726867462.53752: running the handler 25039 1726867462.53780: _low_level_execute_command(): starting 25039 1726867462.53783: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867462.54327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867462.54337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.54358: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867462.54361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.54440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867462.54509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867462.56483: stdout chunk (state=3): >>>/root <<< 25039 1726867462.56487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867462.56489: stdout chunk (state=3): >>><<< 25039 1726867462.56492: stderr chunk (state=3): >>><<< 25039 1726867462.56533: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867462.56549: _low_level_execute_command(): starting 25039 1726867462.56557: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927 `" && echo ansible-tmp-1726867462.5653536-25941-206918250159927="` echo /root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927 `" ) && sleep 0' 25039 1726867462.57795: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867462.57837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867462.57863: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.57873: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25039 1726867462.58186: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867462.58200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867462.59966: stdout chunk (state=3): >>>ansible-tmp-1726867462.5653536-25941-206918250159927=/root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927 <<< 25039 1726867462.60130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867462.60133: stdout chunk (state=3): >>><<< 25039 1726867462.60136: stderr chunk (state=3): >>><<< 25039 1726867462.60285: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867462.5653536-25941-206918250159927=/root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867462.60288: variable 'ansible_module_compression' from source: unknown 25039 1726867462.60290: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867462.60292: variable 'ansible_facts' from source: unknown 25039 1726867462.60394: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927/AnsiballZ_command.py 25039 1726867462.60518: Sending initial data 25039 1726867462.60619: Sent initial data (156 bytes) 25039 1726867462.61144: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867462.61163: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867462.61184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867462.61275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.61304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867462.61325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867462.61343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867462.61405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867462.63306: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867462.63374: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpsf3n9dyg /root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927/AnsiballZ_command.py <<< 25039 1726867462.63379: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927/AnsiballZ_command.py" <<< 25039 1726867462.63407: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpsf3n9dyg" to remote "/root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927/AnsiballZ_command.py" <<< 25039 1726867462.64920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867462.65003: stderr chunk (state=3): >>><<< 25039 1726867462.65013: stdout chunk (state=3): >>><<< 25039 1726867462.65058: done transferring module to remote 25039 1726867462.65075: _low_level_execute_command(): starting 25039 1726867462.65088: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927/ /root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927/AnsiballZ_command.py && sleep 0' 25039 1726867462.65833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867462.65853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867462.65869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867462.65931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.65999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867462.66017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867462.66075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867462.66117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867462.67908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867462.67979: stderr chunk (state=3): >>><<< 25039 1726867462.67982: stdout chunk (state=3): >>><<< 25039 1726867462.67985: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867462.67992: _low_level_execute_command(): starting 25039 1726867462.67995: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927/AnsiballZ_command.py && sleep 0' 25039 1726867462.68615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867462.68625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867462.68638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.68644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 25039 1726867462.68649: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25039 1726867462.68656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867462.68726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867462.68773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867462.68818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867462.84160: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 17:24:22.835981", "end": "2024-09-20 17:24:22.839596", "delta": "0:00:00.003615", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867462.85985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867462.85989: stdout chunk (state=3): >>><<< 25039 1726867462.85991: stderr chunk (state=3): >>><<< 25039 1726867462.85994: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 17:24:22.835981", "end": "2024-09-20 17:24:22.839596", "delta": "0:00:00.003615", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 25039 1726867462.85997: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867462.85999: _low_level_execute_command(): starting 25039 1726867462.86002: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867462.5653536-25941-206918250159927/ > /dev/null 2>&1 && sleep 0' 25039 1726867462.86481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867462.86548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867462.86602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867462.86615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867462.86636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867462.86706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867462.88539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867462.88542: stdout chunk (state=3): >>><<< 25039 1726867462.88782: stderr chunk (state=3): >>><<< 25039 1726867462.88785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
25039 1726867462.88791: handler run complete
25039 1726867462.88794: Evaluated conditional (False): False
25039 1726867462.88797: attempt loop complete, returning result
25039 1726867462.88799: _execute() done
25039 1726867462.88802: dumping result to json
25039 1726867462.88804: done dumping result, returning
25039 1726867462.88807: done running TaskExecutor() for managed_node1/TASK: Get ipv6 routes [0affcac9-a3a5-3ddc-7272-000000000061]
25039 1726867462.88811: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000061
25039 1726867462.88880: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000061
25039 1726867462.88884: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": [
        "ip",
        "-6",
        "route"
    ],
    "delta": "0:00:00.003615",
    "end": "2024-09-20 17:24:22.839596",
    "rc": 0,
    "start": "2024-09-20 17:24:22.835981"
}

STDOUT:

2001:db8::/32 dev veth0 proto kernel metric 101 pref medium
fe80::/64 dev eth0 proto kernel metric 1024 pref medium
fe80::/64 dev veth0 proto kernel metric 1024 pref medium
default via 2001:db8::1 dev veth0 proto static metric 101 pref medium
25039 1726867462.88960: no more pending results, returning what we have
25039 1726867462.88964: results queue empty
25039 1726867462.88964: checking for any_errors_fatal
25039 1726867462.88969: done checking for any_errors_fatal
25039 1726867462.88969: checking for max_fail_percentage
25039 1726867462.88971: done checking for max_fail_percentage
25039 1726867462.88972: checking to see if all hosts have failed and the running result is not ok
25039 1726867462.88973: done checking to see if all hosts have failed
25039 1726867462.88974: getting the remaining hosts for this loop
25039 1726867462.88975: done getting the remaining hosts for this loop
25039 1726867462.88988: getting the next task for host managed_node1
25039 1726867462.88994: done getting next task
for host managed_node1 25039 1726867462.88996: ^ task is: TASK: Show ipv6_route 25039 1726867462.88998: ^ state is: HOST STATE: block=3, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867462.89001: getting variables 25039 1726867462.89003: in VariableManager get_vars() 25039 1726867462.89038: Calling all_inventory to load vars for managed_node1 25039 1726867462.89041: Calling groups_inventory to load vars for managed_node1 25039 1726867462.89043: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867462.89053: Calling all_plugins_play to load vars for managed_node1 25039 1726867462.89056: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867462.89058: Calling groups_plugins_play to load vars for managed_node1 25039 1726867462.90599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867462.92121: done with get_vars() 25039 1726867462.92139: done getting variables 25039 1726867462.92182: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ipv6_route] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:73 Friday 20 September 2024 17:24:22 -0400 (0:00:00.409) 0:00:20.447 ****** 25039 1726867462.92203: entering _queue_task() for managed_node1/debug 25039 1726867462.92445: worker is 1 (out of 1 available) 25039 1726867462.92459: exiting _queue_task() for 
managed_node1/debug 25039 1726867462.92472: done queuing things up, now waiting for results queue to drain 25039 1726867462.92474: waiting for pending results... 25039 1726867462.92656: running TaskExecutor() for managed_node1/TASK: Show ipv6_route 25039 1726867462.92717: in run() - task 0affcac9-a3a5-3ddc-7272-000000000062 25039 1726867462.92728: variable 'ansible_search_path' from source: unknown 25039 1726867462.92756: calling self._execute() 25039 1726867462.92837: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.92842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.92851: variable 'omit' from source: magic vars 25039 1726867462.93120: variable 'ansible_distribution_major_version' from source: facts 25039 1726867462.93130: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867462.93140: variable 'omit' from source: magic vars 25039 1726867462.93153: variable 'omit' from source: magic vars 25039 1726867462.93179: variable 'omit' from source: magic vars 25039 1726867462.93211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867462.93235: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867462.93257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867462.93268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867462.93280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867462.93303: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867462.93307: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.93312: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.93381: Set connection var ansible_shell_executable to /bin/sh 25039 1726867462.93387: Set connection var ansible_timeout to 10 25039 1726867462.93392: Set connection var ansible_shell_type to sh 25039 1726867462.93395: Set connection var ansible_connection to ssh 25039 1726867462.93401: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867462.93406: Set connection var ansible_pipelining to False 25039 1726867462.93424: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.93427: variable 'ansible_connection' from source: unknown 25039 1726867462.93429: variable 'ansible_module_compression' from source: unknown 25039 1726867462.93432: variable 'ansible_shell_type' from source: unknown 25039 1726867462.93434: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.93436: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.93439: variable 'ansible_pipelining' from source: unknown 25039 1726867462.93442: variable 'ansible_timeout' from source: unknown 25039 1726867462.93446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.93545: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867462.93553: variable 'omit' from source: magic vars 25039 1726867462.93557: starting attempt loop 25039 1726867462.93560: running the handler 25039 1726867462.93651: variable 'ipv6_route' from source: set_fact 25039 1726867462.93665: handler run complete 25039 1726867462.93683: attempt loop complete, returning result 25039 1726867462.93688: _execute() done 25039 1726867462.93690: dumping result to json 25039 
1726867462.93693: done dumping result, returning 25039 1726867462.93696: done running TaskExecutor() for managed_node1/TASK: Show ipv6_route [0affcac9-a3a5-3ddc-7272-000000000062] 25039 1726867462.93698: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000062 25039 1726867462.93774: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000062 25039 1726867462.93779: WORKER PROCESS EXITING ok: [managed_node1] => { "ipv6_route.stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium" } 25039 1726867462.93833: no more pending results, returning what we have 25039 1726867462.93836: results queue empty 25039 1726867462.93837: checking for any_errors_fatal 25039 1726867462.93845: done checking for any_errors_fatal 25039 1726867462.93846: checking for max_fail_percentage 25039 1726867462.93847: done checking for max_fail_percentage 25039 1726867462.93848: checking to see if all hosts have failed and the running result is not ok 25039 1726867462.93849: done checking to see if all hosts have failed 25039 1726867462.93850: getting the remaining hosts for this loop 25039 1726867462.93852: done getting the remaining hosts for this loop 25039 1726867462.93855: getting the next task for host managed_node1 25039 1726867462.93861: done getting next task for host managed_node1 25039 1726867462.93863: ^ task is: TASK: Assert default ipv6 route is set 25039 1726867462.93864: ^ state is: HOST STATE: block=3, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867462.93868: getting variables 25039 1726867462.93869: in VariableManager get_vars() 25039 1726867462.93903: Calling all_inventory to load vars for managed_node1 25039 1726867462.93906: Calling groups_inventory to load vars for managed_node1 25039 1726867462.93910: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867462.93919: Calling all_plugins_play to load vars for managed_node1 25039 1726867462.93922: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867462.93924: Calling groups_plugins_play to load vars for managed_node1 25039 1726867462.95073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867462.96084: done with get_vars() 25039 1726867462.96098: done getting variables 25039 1726867462.96138: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is set] **************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:76 Friday 20 September 2024 17:24:22 -0400 (0:00:00.039) 0:00:20.487 ****** 25039 1726867462.96157: entering _queue_task() for managed_node1/assert 25039 1726867462.96358: worker is 1 (out of 1 available) 25039 1726867462.96371: exiting _queue_task() for managed_node1/assert 25039 1726867462.96385: done queuing things up, now waiting for results queue to drain 25039 1726867462.96387: waiting for pending results... 
25039 1726867462.96559: running TaskExecutor() for managed_node1/TASK: Assert default ipv6 route is set 25039 1726867462.96625: in run() - task 0affcac9-a3a5-3ddc-7272-000000000063 25039 1726867462.96636: variable 'ansible_search_path' from source: unknown 25039 1726867462.96663: calling self._execute() 25039 1726867462.96743: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.96748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.96756: variable 'omit' from source: magic vars 25039 1726867462.97037: variable 'ansible_distribution_major_version' from source: facts 25039 1726867462.97050: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867462.97054: variable 'omit' from source: magic vars 25039 1726867462.97071: variable 'omit' from source: magic vars 25039 1726867462.97098: variable 'omit' from source: magic vars 25039 1726867462.97130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867462.97159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867462.97175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867462.97191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867462.97200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867462.97227: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867462.97230: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.97232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.97304: Set connection var ansible_shell_executable to /bin/sh 25039 
1726867462.97309: Set connection var ansible_timeout to 10 25039 1726867462.97317: Set connection var ansible_shell_type to sh 25039 1726867462.97319: Set connection var ansible_connection to ssh 25039 1726867462.97326: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867462.97331: Set connection var ansible_pipelining to False 25039 1726867462.97349: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.97352: variable 'ansible_connection' from source: unknown 25039 1726867462.97355: variable 'ansible_module_compression' from source: unknown 25039 1726867462.97357: variable 'ansible_shell_type' from source: unknown 25039 1726867462.97360: variable 'ansible_shell_executable' from source: unknown 25039 1726867462.97362: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867462.97364: variable 'ansible_pipelining' from source: unknown 25039 1726867462.97367: variable 'ansible_timeout' from source: unknown 25039 1726867462.97374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867462.97471: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867462.97483: variable 'omit' from source: magic vars 25039 1726867462.97489: starting attempt loop 25039 1726867462.97491: running the handler 25039 1726867462.97588: variable '__test_str' from source: task vars 25039 1726867462.97639: variable 'interface' from source: play vars 25039 1726867462.97647: variable 'ipv6_route' from source: set_fact 25039 1726867462.97658: Evaluated conditional (__test_str in ipv6_route.stdout): True 25039 1726867462.97664: handler run complete 25039 1726867462.97674: attempt loop complete, returning result 25039 1726867462.97676: 
_execute() done 25039 1726867462.97681: dumping result to json 25039 1726867462.97684: done dumping result, returning 25039 1726867462.97689: done running TaskExecutor() for managed_node1/TASK: Assert default ipv6 route is set [0affcac9-a3a5-3ddc-7272-000000000063] 25039 1726867462.97694: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000063 25039 1726867462.97773: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000063 25039 1726867462.97776: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 25039 1726867462.97863: no more pending results, returning what we have 25039 1726867462.97866: results queue empty 25039 1726867462.97867: checking for any_errors_fatal 25039 1726867462.97872: done checking for any_errors_fatal 25039 1726867462.97872: checking for max_fail_percentage 25039 1726867462.97874: done checking for max_fail_percentage 25039 1726867462.97874: checking to see if all hosts have failed and the running result is not ok 25039 1726867462.97875: done checking to see if all hosts have failed 25039 1726867462.97876: getting the remaining hosts for this loop 25039 1726867462.97879: done getting the remaining hosts for this loop 25039 1726867462.97882: getting the next task for host managed_node1 25039 1726867462.97887: done getting next task for host managed_node1 25039 1726867462.97889: ^ task is: TASK: Ensure ping6 command is present 25039 1726867462.97891: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867462.97894: getting variables 25039 1726867462.97895: in VariableManager get_vars() 25039 1726867462.97929: Calling all_inventory to load vars for managed_node1 25039 1726867462.97932: Calling groups_inventory to load vars for managed_node1 25039 1726867462.97934: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867462.97942: Calling all_plugins_play to load vars for managed_node1 25039 1726867462.97945: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867462.97947: Calling groups_plugins_play to load vars for managed_node1 25039 1726867462.98974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867463.00323: done with get_vars() 25039 1726867463.00339: done getting variables 25039 1726867463.00379: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure ping6 command is present] ***************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81 Friday 20 September 2024 17:24:23 -0400 (0:00:00.042) 0:00:20.529 ****** 25039 1726867463.00398: entering _queue_task() for managed_node1/package 25039 1726867463.00594: worker is 1 (out of 1 available) 25039 1726867463.00608: exiting _queue_task() for managed_node1/package 25039 1726867463.00620: done queuing things up, now waiting for results queue to drain 25039 1726867463.00622: waiting for pending results... 
25039 1726867463.00785: running TaskExecutor() for managed_node1/TASK: Ensure ping6 command is present 25039 1726867463.00842: in run() - task 0affcac9-a3a5-3ddc-7272-000000000064 25039 1726867463.00854: variable 'ansible_search_path' from source: unknown 25039 1726867463.00883: calling self._execute() 25039 1726867463.00958: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867463.00962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867463.00974: variable 'omit' from source: magic vars 25039 1726867463.01234: variable 'ansible_distribution_major_version' from source: facts 25039 1726867463.01243: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867463.01249: variable 'omit' from source: magic vars 25039 1726867463.01264: variable 'omit' from source: magic vars 25039 1726867463.01398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867463.03243: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867463.03302: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867463.03306: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867463.03348: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867463.03374: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867463.03473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867463.03496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867463.03516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867463.03542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867463.03552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867463.03659: variable '__network_is_ostree' from source: set_fact 25039 1726867463.03663: variable 'omit' from source: magic vars 25039 1726867463.03693: variable 'omit' from source: magic vars 25039 1726867463.03721: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867463.03751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867463.03769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867463.03784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867463.03803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867463.03892: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867463.03895: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867463.03898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867463.04038: 
Set connection var ansible_shell_executable to /bin/sh 25039 1726867463.04041: Set connection var ansible_timeout to 10 25039 1726867463.04043: Set connection var ansible_shell_type to sh 25039 1726867463.04045: Set connection var ansible_connection to ssh 25039 1726867463.04048: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867463.04050: Set connection var ansible_pipelining to False 25039 1726867463.04052: variable 'ansible_shell_executable' from source: unknown 25039 1726867463.04055: variable 'ansible_connection' from source: unknown 25039 1726867463.04057: variable 'ansible_module_compression' from source: unknown 25039 1726867463.04059: variable 'ansible_shell_type' from source: unknown 25039 1726867463.04061: variable 'ansible_shell_executable' from source: unknown 25039 1726867463.04063: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867463.04065: variable 'ansible_pipelining' from source: unknown 25039 1726867463.04067: variable 'ansible_timeout' from source: unknown 25039 1726867463.04069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867463.04112: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867463.04125: variable 'omit' from source: magic vars 25039 1726867463.04128: starting attempt loop 25039 1726867463.04131: running the handler 25039 1726867463.04138: variable 'ansible_facts' from source: unknown 25039 1726867463.04140: variable 'ansible_facts' from source: unknown 25039 1726867463.04173: _low_level_execute_command(): starting 25039 1726867463.04186: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867463.04808: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867463.04839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867463.04842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867463.04845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867463.04887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867463.04893: stderr chunk (state=3): >>>debug2: match not found <<< 25039 1726867463.04896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.04899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25039 1726867463.04901: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 25039 1726867463.04903: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25039 1726867463.04905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867463.04973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867463.04978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867463.04981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867463.04983: stderr chunk (state=3): >>>debug2: match found <<< 25039 1726867463.04985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.05043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867463.05072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867463.05074: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 25039 1726867463.05126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867463.06794: stdout chunk (state=3): >>>/root <<< 25039 1726867463.06893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867463.06925: stderr chunk (state=3): >>><<< 25039 1726867463.06930: stdout chunk (state=3): >>><<< 25039 1726867463.06943: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867463.06953: _low_level_execute_command(): starting 25039 1726867463.06958: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251 `" && echo 
ansible-tmp-1726867463.069431-25975-187088125320251="` echo /root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251 `" ) && sleep 0' 25039 1726867463.07354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867463.07389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867463.07393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.07395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867463.07397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867463.07399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.07447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867463.07450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867463.07507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867463.09391: stdout chunk (state=3): >>>ansible-tmp-1726867463.069431-25975-187088125320251=/root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251 <<< 25039 1726867463.09502: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 25039 1726867463.09526: stderr chunk (state=3): >>><<< 25039 1726867463.09529: stdout chunk (state=3): >>><<< 25039 1726867463.09541: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867463.069431-25975-187088125320251=/root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867463.09568: variable 'ansible_module_compression' from source: unknown 25039 1726867463.09616: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 25039 1726867463.09653: variable 'ansible_facts' from source: unknown 25039 1726867463.09732: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251/AnsiballZ_dnf.py 25039 
1726867463.09825: Sending initial data 25039 1726867463.09828: Sent initial data (151 bytes) 25039 1726867463.10256: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867463.10259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.10261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 1726867463.10264: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867463.10266: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.10314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867463.10322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867463.10366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867463.11887: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" 
revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25039 1726867463.11895: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867463.11933: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867463.11981: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp7olhyuss /root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251/AnsiballZ_dnf.py <<< 25039 1726867463.11984: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251/AnsiballZ_dnf.py" <<< 25039 1726867463.12022: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp7olhyuss" to remote "/root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251/AnsiballZ_dnf.py" <<< 25039 1726867463.12701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867463.12738: stderr chunk (state=3): >>><<< 25039 1726867463.12741: stdout chunk (state=3): >>><<< 25039 1726867463.12775: done transferring module to remote 25039 1726867463.12786: _low_level_execute_command(): starting 25039 1726867463.12788: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251/ 
/root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251/AnsiballZ_dnf.py && sleep 0' 25039 1726867463.13206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867463.13212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867463.13214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.13216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867463.13218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.13266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867463.13272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867463.13316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867463.15035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867463.15055: stderr chunk (state=3): >>><<< 25039 1726867463.15058: stdout chunk (state=3): >>><<< 25039 1726867463.15074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867463.15079: _low_level_execute_command(): starting 25039 1726867463.15082: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251/AnsiballZ_dnf.py && sleep 0' 25039 1726867463.15461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867463.15492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867463.15496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867463.15499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 25039 1726867463.15501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867463.15503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.15550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867463.15554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867463.15609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867463.56490: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 25039 1726867463.60585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867463.60590: stdout chunk (state=3): >>><<< 25039 1726867463.60592: stderr chunk (state=3): >>><<< 25039 1726867463.60594: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 25039 1726867463.60637: done with _execute_module (ansible.legacy.dnf, {'name': 'iputils', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867463.60641: _low_level_execute_command(): starting 25039 1726867463.60683: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867463.069431-25975-187088125320251/ > /dev/null 2>&1 && sleep 0' 25039 1726867463.61471: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867463.61486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 
1726867463.61548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.61600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867463.61657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867463.61660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867463.61712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867463.63571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867463.63574: stdout chunk (state=3): >>><<< 25039 1726867463.63579: stderr chunk (state=3): >>><<< 25039 1726867463.63621: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867463.63625: handler run complete 25039 1726867463.63782: attempt loop complete, returning result 25039 1726867463.63785: _execute() done 25039 1726867463.63787: dumping result to json 25039 1726867463.63790: done dumping result, returning 25039 1726867463.63792: done running TaskExecutor() for managed_node1/TASK: Ensure ping6 command is present [0affcac9-a3a5-3ddc-7272-000000000064] 25039 1726867463.63794: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000064 25039 1726867463.63862: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000064 25039 1726867463.63865: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 25039 1726867463.63946: no more pending results, returning what we have 25039 1726867463.63950: results queue empty 25039 1726867463.63952: checking for any_errors_fatal 25039 1726867463.63958: done checking for any_errors_fatal 25039 1726867463.63959: checking for max_fail_percentage 25039 1726867463.63961: done checking for max_fail_percentage 25039 1726867463.63962: checking to see if all hosts have failed and the running result is not ok 25039 1726867463.63963: done checking to see if all hosts have failed 25039 1726867463.63964: getting the remaining hosts for this loop 25039 1726867463.63965: done getting the remaining hosts for this loop 25039 1726867463.63969: getting the next task for host managed_node1 25039 1726867463.64185: done getting next task for host managed_node1 25039 1726867463.64188: ^ task is: TASK: Test gateway can be pinged 25039 1726867463.64190: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867463.64194: getting variables 25039 1726867463.64196: in VariableManager get_vars() 25039 1726867463.64231: Calling all_inventory to load vars for managed_node1 25039 1726867463.64233: Calling groups_inventory to load vars for managed_node1 25039 1726867463.64236: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867463.64245: Calling all_plugins_play to load vars for managed_node1 25039 1726867463.64248: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867463.64251: Calling groups_plugins_play to load vars for managed_node1 25039 1726867463.66708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867463.70049: done with get_vars() 25039 1726867463.70078: done getting variables 25039 1726867463.70253: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Test gateway can be pinged] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:86 Friday 20 September 2024 17:24:23 -0400 (0:00:00.698) 0:00:21.229 ****** 25039 1726867463.70425: entering _queue_task() for managed_node1/command 25039 1726867463.70847: worker is 1 (out of 1 available) 25039 1726867463.70868: exiting _queue_task() for managed_node1/command 25039 1726867463.70882: done queuing things up, now waiting for results queue to drain 25039 1726867463.70884: waiting for pending results... 
25039 1726867463.71527: running TaskExecutor() for managed_node1/TASK: Test gateway can be pinged 25039 1726867463.71624: in run() - task 0affcac9-a3a5-3ddc-7272-000000000065 25039 1726867463.71628: variable 'ansible_search_path' from source: unknown 25039 1726867463.71886: calling self._execute() 25039 1726867463.71980: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867463.71984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867463.71995: variable 'omit' from source: magic vars 25039 1726867463.72424: variable 'ansible_distribution_major_version' from source: facts 25039 1726867463.72436: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867463.72442: variable 'omit' from source: magic vars 25039 1726867463.72463: variable 'omit' from source: magic vars 25039 1726867463.72604: variable 'omit' from source: magic vars 25039 1726867463.72608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867463.72610: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867463.72613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867463.72615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867463.72624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867463.72653: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867463.72656: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867463.72661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867463.72761: Set connection var ansible_shell_executable to /bin/sh 25039 
1726867463.72768: Set connection var ansible_timeout to 10 25039 1726867463.72774: Set connection var ansible_shell_type to sh 25039 1726867463.72778: Set connection var ansible_connection to ssh 25039 1726867463.72786: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867463.72792: Set connection var ansible_pipelining to False 25039 1726867463.72821: variable 'ansible_shell_executable' from source: unknown 25039 1726867463.72824: variable 'ansible_connection' from source: unknown 25039 1726867463.72827: variable 'ansible_module_compression' from source: unknown 25039 1726867463.72830: variable 'ansible_shell_type' from source: unknown 25039 1726867463.72832: variable 'ansible_shell_executable' from source: unknown 25039 1726867463.72834: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867463.72836: variable 'ansible_pipelining' from source: unknown 25039 1726867463.72838: variable 'ansible_timeout' from source: unknown 25039 1726867463.72843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867463.72982: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867463.72994: variable 'omit' from source: magic vars 25039 1726867463.72999: starting attempt loop 25039 1726867463.73002: running the handler 25039 1726867463.73024: _low_level_execute_command(): starting 25039 1726867463.73041: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867463.73704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867463.73716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867463.73726: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 25039 1726867463.73742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867463.73754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867463.73804: stderr chunk (state=3): >>>debug2: match not found <<< 25039 1726867463.73807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.73813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25039 1726867463.73817: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 25039 1726867463.73819: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25039 1726867463.73858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.73960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867463.73995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867463.74095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867463.74185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867463.75762: stdout chunk (state=3): >>>/root <<< 25039 1726867463.75901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867463.75922: stderr chunk (state=3): >>><<< 25039 1726867463.75944: stdout chunk (state=3): >>><<< 25039 1726867463.76052: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867463.76057: _low_level_execute_command(): starting 25039 1726867463.76061: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958 `" && echo ansible-tmp-1726867463.7596526-26009-213371128109958="` echo /root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958 `" ) && sleep 0' 25039 1726867463.76574: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867463.76594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867463.76622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867463.76638: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867463.76693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.76748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867463.76762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867463.76782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867463.76854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867463.78776: stdout chunk (state=3): >>>ansible-tmp-1726867463.7596526-26009-213371128109958=/root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958 <<< 25039 1726867463.78957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867463.78960: stdout chunk (state=3): >>><<< 25039 1726867463.78962: stderr chunk (state=3): >>><<< 25039 1726867463.79289: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867463.7596526-26009-213371128109958=/root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867463.79292: variable 'ansible_module_compression' from source: unknown 25039 1726867463.79294: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867463.79297: variable 'ansible_facts' from source: unknown 25039 1726867463.79406: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958/AnsiballZ_command.py 25039 1726867463.79798: Sending initial data 25039 1726867463.79805: Sent initial data (156 bytes) 25039 1726867463.80801: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867463.80822: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867463.80836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867463.80930: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.81154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867463.81181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867463.81254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867463.82809: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867463.82848: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 25039 1726867463.82902: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp1nh658bc /root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958/AnsiballZ_command.py <<< 25039 1726867463.82905: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958/AnsiballZ_command.py" <<< 25039 1726867463.82973: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp1nh658bc" to remote "/root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958/AnsiballZ_command.py" <<< 25039 1726867463.84326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867463.84341: stdout chunk (state=3): >>><<< 25039 1726867463.84353: stderr chunk (state=3): >>><<< 25039 1726867463.84484: done transferring module to remote 25039 1726867463.84663: _low_level_execute_command(): starting 25039 1726867463.84666: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958/ /root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958/AnsiballZ_command.py && sleep 0' 25039 1726867463.85776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867463.85866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.85930: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867463.85941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.85991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867463.86002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867463.86143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867463.86211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867463.87994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867463.88040: stderr chunk (state=3): >>><<< 25039 1726867463.88049: stdout chunk (state=3): >>><<< 25039 1726867463.88187: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867463.88191: _low_level_execute_command(): starting 25039 1726867463.88194: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958/AnsiballZ_command.py && sleep 0' 25039 1726867463.89444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867463.89536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867463.89539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867463.89618: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 25039 1726867463.89776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867464.05587: stdout chunk (state=3): >>> {"changed": true, "stdout": "PING 2001:db8::1 (2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.050 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.050/0.050/0.050/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-20 17:24:24.047123", "end": "2024-09-20 17:24:24.051217", "delta": "0:00:00.004094", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867464.06854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867464.06858: stderr chunk (state=3): >>><<< 25039 1726867464.06861: stdout chunk (state=3): >>><<< 25039 1726867464.06903: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "PING 2001:db8::1 (2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.050 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.050/0.050/0.050/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-20 17:24:24.047123", "end": "2024-09-20 17:24:24.051217", "delta": "0:00:00.004094", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 25039 1726867464.07125: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ping6 -c1 2001:db8::1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867464.07132: _low_level_execute_command(): starting 25039 1726867464.07139: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867463.7596526-26009-213371128109958/ > /dev/null 2>&1 && sleep 0' 25039 1726867464.08443: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867464.08452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867464.08490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867464.08502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867464.08580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867464.08593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867464.08731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867464.08794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867464.10645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867464.10675: stderr chunk (state=3): >>><<< 25039 1726867464.10712: stdout chunk (state=3): >>><<< 25039 1726867464.10931: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867464.10934: handler run complete 25039 1726867464.10936: Evaluated conditional (False): False 25039 1726867464.10938: attempt loop complete, returning result 25039 1726867464.10940: _execute() done 25039 1726867464.10942: dumping result to json 25039 1726867464.10943: done dumping result, returning 25039 1726867464.10946: done running TaskExecutor() for managed_node1/TASK: Test gateway can be pinged [0affcac9-a3a5-3ddc-7272-000000000065] 25039 1726867464.10948: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000065 25039 1726867464.11156: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000065 25039 1726867464.11161: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ping6", "-c1", "2001:db8::1" ], "delta": "0:00:00.004094", "end": "2024-09-20 17:24:24.051217", "rc": 0, "start": "2024-09-20 17:24:24.047123" } STDOUT: PING 2001:db8::1 (2001:db8::1) 56 data bytes 64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.050 ms --- 2001:db8::1 ping statistics --- 1 packets transmitted, 1 received, 0% packet loss, time 0ms rtt min/avg/max/mdev = 0.050/0.050/0.050/0.000 ms 25039 1726867464.11231: no more pending results, returning what we have 25039 1726867464.11234: results queue empty 25039 1726867464.11235: checking for any_errors_fatal 25039 1726867464.11245: done checking for any_errors_fatal 25039 1726867464.11246: checking for max_fail_percentage 25039 1726867464.11248: done checking for max_fail_percentage 25039 1726867464.11249: checking to see if all hosts have failed and the running result is not ok 25039 1726867464.11250: done checking to see if all hosts have failed 25039 1726867464.11250: getting the remaining hosts for this loop 25039 1726867464.11252: done getting the remaining 
hosts for this loop 25039 1726867464.11256: getting the next task for host managed_node1 25039 1726867464.11263: done getting next task for host managed_node1 25039 1726867464.11266: ^ task is: TASK: TEARDOWN: remove profiles. 25039 1726867464.11267: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867464.11271: getting variables 25039 1726867464.11273: in VariableManager get_vars() 25039 1726867464.11473: Calling all_inventory to load vars for managed_node1 25039 1726867464.11476: Calling groups_inventory to load vars for managed_node1 25039 1726867464.11519: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867464.11532: Calling all_plugins_play to load vars for managed_node1 25039 1726867464.11535: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867464.11538: Calling groups_plugins_play to load vars for managed_node1 25039 1726867464.15515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867464.18348: done with get_vars() 25039 1726867464.18370: done getting variables 25039 1726867464.18629: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] 
********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:92 Friday 20 September 2024 17:24:24 -0400 (0:00:00.482) 0:00:21.712 ****** 25039 1726867464.18654: entering _queue_task() for managed_node1/debug 25039 1726867464.19614: worker is 1 (out of 1 available) 25039 1726867464.19625: exiting _queue_task() for managed_node1/debug 25039 1726867464.19635: done queuing things up, now waiting for results queue to drain 25039 1726867464.19637: waiting for pending results... 25039 1726867464.19924: running TaskExecutor() for managed_node1/TASK: TEARDOWN: remove profiles. 25039 1726867464.20112: in run() - task 0affcac9-a3a5-3ddc-7272-000000000066 25039 1726867464.20146: variable 'ansible_search_path' from source: unknown 25039 1726867464.20176: calling self._execute() 25039 1726867464.20381: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867464.20385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867464.20388: variable 'omit' from source: magic vars 25039 1726867464.21190: variable 'ansible_distribution_major_version' from source: facts 25039 1726867464.21202: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867464.21212: variable 'omit' from source: magic vars 25039 1726867464.21243: variable 'omit' from source: magic vars 25039 1726867464.21353: variable 'omit' from source: magic vars 25039 1726867464.21356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867464.21463: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867464.21485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867464.21503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 25039 1726867464.21516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867464.21661: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867464.21664: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867464.21667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867464.21880: Set connection var ansible_shell_executable to /bin/sh 25039 1726867464.21892: Set connection var ansible_timeout to 10 25039 1726867464.21895: Set connection var ansible_shell_type to sh 25039 1726867464.21898: Set connection var ansible_connection to ssh 25039 1726867464.21904: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867464.21913: Set connection var ansible_pipelining to False 25039 1726867464.22000: variable 'ansible_shell_executable' from source: unknown 25039 1726867464.22004: variable 'ansible_connection' from source: unknown 25039 1726867464.22007: variable 'ansible_module_compression' from source: unknown 25039 1726867464.22012: variable 'ansible_shell_type' from source: unknown 25039 1726867464.22015: variable 'ansible_shell_executable' from source: unknown 25039 1726867464.22017: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867464.22019: variable 'ansible_pipelining' from source: unknown 25039 1726867464.22021: variable 'ansible_timeout' from source: unknown 25039 1726867464.22023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867464.22290: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 
1726867464.22301: variable 'omit' from source: magic vars 25039 1726867464.22483: starting attempt loop 25039 1726867464.22487: running the handler 25039 1726867464.22532: handler run complete 25039 1726867464.22549: attempt loop complete, returning result 25039 1726867464.22553: _execute() done 25039 1726867464.22555: dumping result to json 25039 1726867464.22558: done dumping result, returning 25039 1726867464.22650: done running TaskExecutor() for managed_node1/TASK: TEARDOWN: remove profiles. [0affcac9-a3a5-3ddc-7272-000000000066] 25039 1726867464.22653: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000066 25039 1726867464.22718: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000066 25039 1726867464.22722: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ################################################## 25039 1726867464.22768: no more pending results, returning what we have 25039 1726867464.22771: results queue empty 25039 1726867464.22772: checking for any_errors_fatal 25039 1726867464.22783: done checking for any_errors_fatal 25039 1726867464.22783: checking for max_fail_percentage 25039 1726867464.22785: done checking for max_fail_percentage 25039 1726867464.22786: checking to see if all hosts have failed and the running result is not ok 25039 1726867464.22787: done checking to see if all hosts have failed 25039 1726867464.22788: getting the remaining hosts for this loop 25039 1726867464.22789: done getting the remaining hosts for this loop 25039 1726867464.22792: getting the next task for host managed_node1 25039 1726867464.22801: done getting next task for host managed_node1 25039 1726867464.22807: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25039 1726867464.22810: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25039 1726867464.22828: getting variables 25039 1726867464.22830: in VariableManager get_vars() 25039 1726867464.22870: Calling all_inventory to load vars for managed_node1 25039 1726867464.22873: Calling groups_inventory to load vars for managed_node1 25039 1726867464.22875: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867464.22887: Calling all_plugins_play to load vars for managed_node1 25039 1726867464.22889: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867464.22892: Calling groups_plugins_play to load vars for managed_node1 25039 1726867464.25622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867464.27491: done with get_vars() 25039 1726867464.27529: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:24:24 -0400 (0:00:00.089) 0:00:21.802 ****** 25039 1726867464.27642: entering _queue_task() for managed_node1/include_tasks 25039 1726867464.28286: worker is 1 (out of 1 available) 25039 1726867464.28294: exiting _queue_task() for managed_node1/include_tasks 25039 1726867464.28303: done queuing things up, now waiting for results queue to drain 25039 1726867464.28305: waiting for pending results... 
25039 1726867464.28425: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25039 1726867464.28630: in run() - task 0affcac9-a3a5-3ddc-7272-00000000006e 25039 1726867464.28634: variable 'ansible_search_path' from source: unknown 25039 1726867464.28637: variable 'ansible_search_path' from source: unknown 25039 1726867464.28639: calling self._execute() 25039 1726867464.28743: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867464.28747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867464.28750: variable 'omit' from source: magic vars 25039 1726867464.29065: variable 'ansible_distribution_major_version' from source: facts 25039 1726867464.29086: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867464.29095: _execute() done 25039 1726867464.29098: dumping result to json 25039 1726867464.29101: done dumping result, returning 25039 1726867464.29111: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-3ddc-7272-00000000006e] 25039 1726867464.29114: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000006e 25039 1726867464.29275: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000006e 25039 1726867464.29281: WORKER PROCESS EXITING 25039 1726867464.29341: no more pending results, returning what we have 25039 1726867464.29346: in VariableManager get_vars() 25039 1726867464.29510: Calling all_inventory to load vars for managed_node1 25039 1726867464.29513: Calling groups_inventory to load vars for managed_node1 25039 1726867464.29516: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867464.29526: Calling all_plugins_play to load vars for managed_node1 25039 1726867464.29529: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867464.29532: Calling 
groups_plugins_play to load vars for managed_node1 25039 1726867464.32002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867464.35068: done with get_vars() 25039 1726867464.35129: variable 'ansible_search_path' from source: unknown 25039 1726867464.35131: variable 'ansible_search_path' from source: unknown 25039 1726867464.35173: we have included files to process 25039 1726867464.35175: generating all_blocks data 25039 1726867464.35381: done generating all_blocks data 25039 1726867464.35386: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25039 1726867464.35387: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25039 1726867464.35391: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25039 1726867464.36323: done processing included file 25039 1726867464.36324: iterating over new_blocks loaded from include file 25039 1726867464.36326: in VariableManager get_vars() 25039 1726867464.36348: done with get_vars() 25039 1726867464.36350: filtering new block on tags 25039 1726867464.36365: done filtering new block on tags 25039 1726867464.36368: in VariableManager get_vars() 25039 1726867464.36390: done with get_vars() 25039 1726867464.36392: filtering new block on tags 25039 1726867464.36589: done filtering new block on tags 25039 1726867464.36592: in VariableManager get_vars() 25039 1726867464.36617: done with get_vars() 25039 1726867464.36620: filtering new block on tags 25039 1726867464.36638: done filtering new block on tags 25039 1726867464.36641: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 25039 1726867464.36646: extending task lists for 
all hosts with included blocks 25039 1726867464.37488: done extending task lists 25039 1726867464.37489: done processing included files 25039 1726867464.37490: results queue empty 25039 1726867464.37491: checking for any_errors_fatal 25039 1726867464.37494: done checking for any_errors_fatal 25039 1726867464.37495: checking for max_fail_percentage 25039 1726867464.37496: done checking for max_fail_percentage 25039 1726867464.37497: checking to see if all hosts have failed and the running result is not ok 25039 1726867464.37498: done checking to see if all hosts have failed 25039 1726867464.37499: getting the remaining hosts for this loop 25039 1726867464.37500: done getting the remaining hosts for this loop 25039 1726867464.37502: getting the next task for host managed_node1 25039 1726867464.37507: done getting next task for host managed_node1 25039 1726867464.37510: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25039 1726867464.37513: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867464.37523: getting variables 25039 1726867464.37524: in VariableManager get_vars() 25039 1726867464.37541: Calling all_inventory to load vars for managed_node1 25039 1726867464.37543: Calling groups_inventory to load vars for managed_node1 25039 1726867464.37545: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867464.37551: Calling all_plugins_play to load vars for managed_node1 25039 1726867464.37554: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867464.37557: Calling groups_plugins_play to load vars for managed_node1 25039 1726867464.39695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867464.41286: done with get_vars() 25039 1726867464.41306: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:24:24 -0400 (0:00:00.137) 0:00:21.939 ****** 25039 1726867464.41385: entering _queue_task() for managed_node1/setup 25039 1726867464.42042: worker is 1 (out of 1 available) 25039 1726867464.42055: exiting _queue_task() for managed_node1/setup 25039 1726867464.42068: done queuing things up, now waiting for results queue to drain 25039 1726867464.42069: waiting for pending results... 
25039 1726867464.42622: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25039 1726867464.42926: in run() - task 0affcac9-a3a5-3ddc-7272-000000000513 25039 1726867464.42930: variable 'ansible_search_path' from source: unknown 25039 1726867464.42932: variable 'ansible_search_path' from source: unknown 25039 1726867464.42942: calling self._execute() 25039 1726867464.43050: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867464.43053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867464.43065: variable 'omit' from source: magic vars 25039 1726867464.43757: variable 'ansible_distribution_major_version' from source: facts 25039 1726867464.43768: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867464.44181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867464.46370: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867464.46447: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867464.46484: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867464.46520: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867464.46552: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867464.46625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867464.46681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867464.46684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867464.46726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867464.46743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867464.46794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867464.46816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867464.46848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867464.46952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867464.46955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867464.47043: variable '__network_required_facts' from source: role 
'' defaults 25039 1726867464.47050: variable 'ansible_facts' from source: unknown 25039 1726867464.47867: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 25039 1726867464.47876: when evaluation is False, skipping this task 25039 1726867464.47920: _execute() done 25039 1726867464.47924: dumping result to json 25039 1726867464.47926: done dumping result, returning 25039 1726867464.47930: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-3ddc-7272-000000000513] 25039 1726867464.47932: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000513 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25039 1726867464.48164: no more pending results, returning what we have 25039 1726867464.48168: results queue empty 25039 1726867464.48169: checking for any_errors_fatal 25039 1726867464.48171: done checking for any_errors_fatal 25039 1726867464.48171: checking for max_fail_percentage 25039 1726867464.48173: done checking for max_fail_percentage 25039 1726867464.48174: checking to see if all hosts have failed and the running result is not ok 25039 1726867464.48175: done checking to see if all hosts have failed 25039 1726867464.48176: getting the remaining hosts for this loop 25039 1726867464.48180: done getting the remaining hosts for this loop 25039 1726867464.48184: getting the next task for host managed_node1 25039 1726867464.48194: done getting next task for host managed_node1 25039 1726867464.48197: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 25039 1726867464.48201: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25039 1726867464.48219: getting variables 25039 1726867464.48221: in VariableManager get_vars() 25039 1726867464.48263: Calling all_inventory to load vars for managed_node1 25039 1726867464.48265: Calling groups_inventory to load vars for managed_node1 25039 1726867464.48268: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867464.48517: Calling all_plugins_play to load vars for managed_node1 25039 1726867464.48521: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867464.48526: Calling groups_plugins_play to load vars for managed_node1 25039 1726867464.49134: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000513 25039 1726867464.49138: WORKER PROCESS EXITING 25039 1726867464.49833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867464.50713: done with get_vars() 25039 1726867464.50728: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:24:24 -0400 (0:00:00.094) 0:00:22.033 ****** 25039 1726867464.50804: entering _queue_task() for managed_node1/stat 25039 1726867464.51085: worker is 
1 (out of 1 available) 25039 1726867464.51099: exiting _queue_task() for managed_node1/stat 25039 1726867464.51113: done queuing things up, now waiting for results queue to drain 25039 1726867464.51115: waiting for pending results... 25039 1726867464.51306: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 25039 1726867464.51762: in run() - task 0affcac9-a3a5-3ddc-7272-000000000515 25039 1726867464.51865: variable 'ansible_search_path' from source: unknown 25039 1726867464.51875: variable 'ansible_search_path' from source: unknown 25039 1726867464.51921: calling self._execute() 25039 1726867464.52217: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867464.52230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867464.52244: variable 'omit' from source: magic vars 25039 1726867464.53131: variable 'ansible_distribution_major_version' from source: facts 25039 1726867464.53180: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867464.53459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867464.53767: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867464.53821: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867464.53860: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867464.53927: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867464.54008: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867464.54092: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867464.54095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867464.54119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867464.54220: variable '__network_is_ostree' from source: set_fact 25039 1726867464.54231: Evaluated conditional (not __network_is_ostree is defined): False 25039 1726867464.54237: when evaluation is False, skipping this task 25039 1726867464.54243: _execute() done 25039 1726867464.54254: dumping result to json 25039 1726867464.54262: done dumping result, returning 25039 1726867464.54273: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-3ddc-7272-000000000515] 25039 1726867464.54308: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000515 25039 1726867464.54449: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000515 25039 1726867464.54454: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25039 1726867464.54507: no more pending results, returning what we have 25039 1726867464.54513: results queue empty 25039 1726867464.54515: checking for any_errors_fatal 25039 1726867464.54519: done checking for any_errors_fatal 25039 1726867464.54520: checking for max_fail_percentage 25039 1726867464.54522: done checking for max_fail_percentage 25039 1726867464.54523: checking to see if all hosts have failed and the running result is not ok 25039 
1726867464.54524: done checking to see if all hosts have failed 25039 1726867464.54532: getting the remaining hosts for this loop 25039 1726867464.54534: done getting the remaining hosts for this loop 25039 1726867464.54537: getting the next task for host managed_node1 25039 1726867464.54544: done getting next task for host managed_node1 25039 1726867464.54548: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25039 1726867464.54552: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867464.54569: getting variables 25039 1726867464.54571: in VariableManager get_vars() 25039 1726867464.54636: Calling all_inventory to load vars for managed_node1 25039 1726867464.54639: Calling groups_inventory to load vars for managed_node1 25039 1726867464.54642: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867464.54651: Calling all_plugins_play to load vars for managed_node1 25039 1726867464.54654: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867464.54657: Calling groups_plugins_play to load vars for managed_node1 25039 1726867464.56337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867464.58008: done with get_vars() 25039 1726867464.58038: done getting variables 25039 1726867464.58109: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:24:24 -0400 (0:00:00.073) 0:00:22.107 ****** 25039 1726867464.58154: entering _queue_task() for managed_node1/set_fact 25039 1726867464.58527: worker is 1 (out of 1 available) 25039 1726867464.58787: exiting _queue_task() for managed_node1/set_fact 25039 1726867464.58797: done queuing things up, now waiting for results queue to drain 25039 1726867464.58798: waiting for pending results... 
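The skips above hinge on Jinja2 conditionals such as `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, which the log reports as `Evaluated conditional (...): False`. A minimal sketch of that evaluation, assuming the `difference` filter behaves like list/set difference (the fact names below are hypothetical, not the role's actual variables):

```python
# Sketch of the "are any required facts missing?" condition seen in the log:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# Assumption: `difference` keeps items of the first list absent from the second.
def facts_missing(required_facts, ansible_facts):
    """Return the required fact names not yet present in ansible_facts."""
    return [name for name in required_facts if name not in ansible_facts]

required = ["distribution", "distribution_major_version"]  # hypothetical names
gathered = {"distribution": "Fedora", "distribution_major_version": "40"}

# All required facts are present, so the condition is False and the task is
# skipped -- matching "when evaluation is False, skipping this task" above.
print(len(facts_missing(required, gathered)) > 0)
```

The same mechanism explains the two ostree tasks being skipped: their `when` is `not __network_is_ostree is defined`, and since `__network_is_ostree` was already set by an earlier `set_fact`, the condition evaluates to False.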
25039 1726867464.58929: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25039 1726867464.59003: in run() - task 0affcac9-a3a5-3ddc-7272-000000000516 25039 1726867464.59031: variable 'ansible_search_path' from source: unknown 25039 1726867464.59042: variable 'ansible_search_path' from source: unknown 25039 1726867464.59135: calling self._execute() 25039 1726867464.59184: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867464.59197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867464.59212: variable 'omit' from source: magic vars 25039 1726867464.59611: variable 'ansible_distribution_major_version' from source: facts 25039 1726867464.59628: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867464.59820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867464.60118: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867464.60167: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867464.60210: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867464.60259: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867464.60442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867464.60445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867464.60448: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867464.60451: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867464.60551: variable '__network_is_ostree' from source: set_fact 25039 1726867464.60658: Evaluated conditional (not __network_is_ostree is defined): False 25039 1726867464.60662: when evaluation is False, skipping this task 25039 1726867464.60664: _execute() done 25039 1726867464.60666: dumping result to json 25039 1726867464.60671: done dumping result, returning 25039 1726867464.60675: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-3ddc-7272-000000000516] 25039 1726867464.60678: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000516 25039 1726867464.60742: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000516 25039 1726867464.60746: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25039 1726867464.60806: no more pending results, returning what we have 25039 1726867464.60810: results queue empty 25039 1726867464.60811: checking for any_errors_fatal 25039 1726867464.60819: done checking for any_errors_fatal 25039 1726867464.60820: checking for max_fail_percentage 25039 1726867464.60822: done checking for max_fail_percentage 25039 1726867464.60823: checking to see if all hosts have failed and the running result is not ok 25039 1726867464.60824: done checking to see if all hosts have failed 25039 1726867464.60825: getting the remaining hosts for this loop 25039 1726867464.60827: done getting the remaining hosts for this loop 
25039 1726867464.60831: getting the next task for host managed_node1 25039 1726867464.60841: done getting next task for host managed_node1 25039 1726867464.60845: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 25039 1726867464.60849: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867464.60878: getting variables 25039 1726867464.60881: in VariableManager get_vars() 25039 1726867464.60927: Calling all_inventory to load vars for managed_node1 25039 1726867464.60930: Calling groups_inventory to load vars for managed_node1 25039 1726867464.60932: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867464.60944: Calling all_plugins_play to load vars for managed_node1 25039 1726867464.60948: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867464.60951: Calling groups_plugins_play to load vars for managed_node1 25039 1726867464.62679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867464.64351: done with get_vars() 25039 1726867464.64374: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:24:24 -0400 (0:00:00.063) 0:00:22.170 ****** 25039 1726867464.64475: entering _queue_task() for managed_node1/service_facts 25039 1726867464.64854: worker is 1 (out of 1 available) 25039 1726867464.64866: exiting _queue_task() for managed_node1/service_facts 25039 1726867464.64999: done queuing things up, now waiting for results queue to drain 25039 1726867464.65001: waiting for pending results... 
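The task now being queued runs the `service_facts` module, which populates `ansible_facts.services` with per-unit entries (each carrying fields such as `state` and `status`). A minimal sketch of how a role might gather and then consume that data (hypothetical task names, not the role's actual source):

```yaml
# Hypothetical sketch: gather service facts, then branch on a unit's state.
- name: Check which services are running
  ansible.builtin.service_facts:

- name: Example consumer (hypothetical)
  ansible.builtin.debug:
    msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
  when: "'NetworkManager.service' in ansible_facts.services"
```

In the log that follows, the module run itself is visible as a series of `_low_level_execute_command()` calls: probe the remote home directory, create a per-invocation temp directory, transfer the `AnsiballZ_service_facts.py` payload over SFTP, and `chmod u+x` it before execution.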
25039 1726867464.65296: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 25039 1726867464.65331: in run() - task 0affcac9-a3a5-3ddc-7272-000000000518 25039 1726867464.65393: variable 'ansible_search_path' from source: unknown 25039 1726867464.65397: variable 'ansible_search_path' from source: unknown 25039 1726867464.65399: calling self._execute() 25039 1726867464.65491: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867464.65508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867464.65523: variable 'omit' from source: magic vars 25039 1726867464.65904: variable 'ansible_distribution_major_version' from source: facts 25039 1726867464.65921: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867464.65979: variable 'omit' from source: magic vars 25039 1726867464.66021: variable 'omit' from source: magic vars 25039 1726867464.66061: variable 'omit' from source: magic vars 25039 1726867464.66109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867464.66147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867464.66171: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867464.66201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867464.66262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867464.66266: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867464.66268: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867464.66270: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 25039 1726867464.66371: Set connection var ansible_shell_executable to /bin/sh 25039 1726867464.66386: Set connection var ansible_timeout to 10 25039 1726867464.66394: Set connection var ansible_shell_type to sh 25039 1726867464.66399: Set connection var ansible_connection to ssh 25039 1726867464.66416: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867464.66424: Set connection var ansible_pipelining to False 25039 1726867464.66447: variable 'ansible_shell_executable' from source: unknown 25039 1726867464.66453: variable 'ansible_connection' from source: unknown 25039 1726867464.66459: variable 'ansible_module_compression' from source: unknown 25039 1726867464.66479: variable 'ansible_shell_type' from source: unknown 25039 1726867464.66482: variable 'ansible_shell_executable' from source: unknown 25039 1726867464.66483: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867464.66485: variable 'ansible_pipelining' from source: unknown 25039 1726867464.66487: variable 'ansible_timeout' from source: unknown 25039 1726867464.66518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867464.66695: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25039 1726867464.66711: variable 'omit' from source: magic vars 25039 1726867464.66720: starting attempt loop 25039 1726867464.66727: running the handler 25039 1726867464.66847: _low_level_execute_command(): starting 25039 1726867464.66849: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867464.67480: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867464.67511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867464.67605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867464.67655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867464.67658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867464.67888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867464.67969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867464.69660: stdout chunk (state=3): >>>/root <<< 25039 1726867464.69753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867464.69793: stderr chunk (state=3): >>><<< 25039 1726867464.69795: stdout chunk (state=3): >>><<< 25039 1726867464.69810: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867464.69873: _low_level_execute_command(): starting 25039 1726867464.69879: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276 `" && echo ansible-tmp-1726867464.698153-26055-81359888172276="` echo /root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276 `" ) && sleep 0' 25039 1726867464.70516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867464.70528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867464.70546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867464.70618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867464.72517: stdout chunk (state=3): >>>ansible-tmp-1726867464.698153-26055-81359888172276=/root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276 <<< 25039 1726867464.72631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867464.72653: stderr chunk (state=3): >>><<< 25039 1726867464.72656: stdout chunk (state=3): >>><<< 25039 1726867464.72669: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867464.698153-26055-81359888172276=/root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867464.72710: variable 'ansible_module_compression' from source: unknown 25039 1726867464.72747: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 25039 1726867464.72779: variable 'ansible_facts' from source: unknown 25039 1726867464.72838: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276/AnsiballZ_service_facts.py 25039 1726867464.72939: Sending initial data 25039 1726867464.72943: Sent initial data (160 bytes) 25039 1726867464.73505: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867464.73559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867464.73563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867464.73617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867464.75161: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25039 1726867464.75170: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867464.75202: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867464.75252: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpdijfsqmp /root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276/AnsiballZ_service_facts.py <<< 25039 1726867464.75255: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276/AnsiballZ_service_facts.py" <<< 25039 1726867464.75293: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpdijfsqmp" to remote "/root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276/AnsiballZ_service_facts.py" <<< 25039 1726867464.75846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867464.75879: stderr chunk (state=3): >>><<< 25039 1726867464.75883: stdout chunk (state=3): >>><<< 25039 1726867464.75950: done transferring module to remote 25039 1726867464.75958: _low_level_execute_command(): starting 25039 1726867464.75962: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276/ /root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276/AnsiballZ_service_facts.py && sleep 0' 25039 1726867464.76370: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867464.76373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867464.76375: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867464.76383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867464.76427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867464.76434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867464.76479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867464.78211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867464.78231: stderr chunk (state=3): >>><<< 25039 1726867464.78235: stdout chunk (state=3): >>><<< 25039 1726867464.78246: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867464.78249: _low_level_execute_command(): starting 25039 1726867464.78254: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276/AnsiballZ_service_facts.py && sleep 0' 25039 1726867464.78654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867464.78657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867464.78659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867464.78661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867464.78705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867464.78708: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867464.78764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867466.29571: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 25039 1726867466.29649: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": 
"systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 25039 1726867466.31119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 25039 1726867466.31188: stderr chunk (state=3): >>><<< 25039 1726867466.31191: stdout chunk (state=3): >>><<< 25039 1726867466.31195: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, 
"sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867466.32154: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867466.32164: _low_level_execute_command(): starting 25039 1726867466.32168: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867464.698153-26055-81359888172276/ > /dev/null 2>&1 && sleep 0' 25039 1726867466.32805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867466.32808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867466.32810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867466.32813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867466.32880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867466.32927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867466.32969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867466.34792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867466.34983: stderr chunk (state=3): >>><<< 25039 1726867466.34987: stdout chunk (state=3): >>><<< 25039 1726867466.34989: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867466.34992: handler run 
complete 25039 1726867466.35061: variable 'ansible_facts' from source: unknown 25039 1726867466.35207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867466.35687: variable 'ansible_facts' from source: unknown 25039 1726867466.35823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867466.36083: attempt loop complete, returning result 25039 1726867466.36094: _execute() done 25039 1726867466.36096: dumping result to json 25039 1726867466.36128: done dumping result, returning 25039 1726867466.36143: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-3ddc-7272-000000000518] 25039 1726867466.36152: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000518 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25039 1726867466.37551: no more pending results, returning what we have 25039 1726867466.37554: results queue empty 25039 1726867466.37555: checking for any_errors_fatal 25039 1726867466.37557: done checking for any_errors_fatal 25039 1726867466.37558: checking for max_fail_percentage 25039 1726867466.37559: done checking for max_fail_percentage 25039 1726867466.37560: checking to see if all hosts have failed and the running result is not ok 25039 1726867466.37561: done checking to see if all hosts have failed 25039 1726867466.37562: getting the remaining hosts for this loop 25039 1726867466.37563: done getting the remaining hosts for this loop 25039 1726867466.37566: getting the next task for host managed_node1 25039 1726867466.37571: done getting next task for host managed_node1 25039 1726867466.37574: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 25039 1726867466.37581: ^ state is: HOST STATE: 
block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25039 1726867466.37590: getting variables 25039 1726867466.37591: in VariableManager get_vars() 25039 1726867466.37621: Calling all_inventory to load vars for managed_node1 25039 1726867466.37623: Calling groups_inventory to load vars for managed_node1 25039 1726867466.37626: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867466.37634: Calling all_plugins_play to load vars for managed_node1 25039 1726867466.37637: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867466.37640: Calling groups_plugins_play to load vars for managed_node1 25039 1726867466.38291: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000518 25039 1726867466.38294: WORKER PROCESS EXITING 25039 1726867466.38863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867466.40508: done with get_vars() 25039 1726867466.40528: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 
September 2024 17:24:26 -0400 (0:00:01.761) 0:00:23.931 ****** 25039 1726867466.40628: entering _queue_task() for managed_node1/package_facts 25039 1726867466.41045: worker is 1 (out of 1 available) 25039 1726867466.41056: exiting _queue_task() for managed_node1/package_facts 25039 1726867466.41066: done queuing things up, now waiting for results queue to drain 25039 1726867466.41068: waiting for pending results... 25039 1726867466.41244: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 25039 1726867466.41409: in run() - task 0affcac9-a3a5-3ddc-7272-000000000519 25039 1726867466.41429: variable 'ansible_search_path' from source: unknown 25039 1726867466.41437: variable 'ansible_search_path' from source: unknown 25039 1726867466.41480: calling self._execute() 25039 1726867466.41584: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867466.41597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867466.41613: variable 'omit' from source: magic vars 25039 1726867466.41993: variable 'ansible_distribution_major_version' from source: facts 25039 1726867466.42124: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867466.42129: variable 'omit' from source: magic vars 25039 1726867466.42132: variable 'omit' from source: magic vars 25039 1726867466.42145: variable 'omit' from source: magic vars 25039 1726867466.42189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867466.42228: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867466.42262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867466.42287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 
1726867466.42303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867466.42336: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867466.42353: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867466.42361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867466.42472: Set connection var ansible_shell_executable to /bin/sh 25039 1726867466.42486: Set connection var ansible_timeout to 10 25039 1726867466.42495: Set connection var ansible_shell_type to sh 25039 1726867466.42501: Set connection var ansible_connection to ssh 25039 1726867466.42511: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867466.42562: Set connection var ansible_pipelining to False 25039 1726867466.42567: variable 'ansible_shell_executable' from source: unknown 25039 1726867466.42572: variable 'ansible_connection' from source: unknown 25039 1726867466.42575: variable 'ansible_module_compression' from source: unknown 25039 1726867466.42578: variable 'ansible_shell_type' from source: unknown 25039 1726867466.42580: variable 'ansible_shell_executable' from source: unknown 25039 1726867466.42582: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867466.42587: variable 'ansible_pipelining' from source: unknown 25039 1726867466.42594: variable 'ansible_timeout' from source: unknown 25039 1726867466.42601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867466.42800: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25039 1726867466.42891: variable 'omit' from source: magic vars 25039 1726867466.42896: 
starting attempt loop 25039 1726867466.42898: running the handler 25039 1726867466.42901: _low_level_execute_command(): starting 25039 1726867466.42903: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867466.43671: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867466.43698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867466.43721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867466.43798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867466.45456: stdout chunk (state=3): >>>/root <<< 25039 1726867466.45607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867466.45611: stdout chunk (state=3): >>><<< 25039 1726867466.45614: stderr chunk (state=3): >>><<< 25039 1726867466.45631: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867466.45649: _low_level_execute_command(): starting 25039 1726867466.45727: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868 `" && echo ansible-tmp-1726867466.4563708-26132-212176108122868="` echo /root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868 `" ) && sleep 0' 25039 1726867466.46586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867466.46597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867466.46599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867466.46685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867466.46755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867466.48618: stdout chunk (state=3): >>>ansible-tmp-1726867466.4563708-26132-212176108122868=/root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868 <<< 25039 1726867466.48903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867466.48907: stdout chunk (state=3): >>><<< 25039 1726867466.48909: stderr chunk (state=3): >>><<< 25039 1726867466.48912: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867466.4563708-26132-212176108122868=/root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867466.48914: variable 'ansible_module_compression' from source: unknown 25039 1726867466.48917: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 25039 1726867466.48970: variable 'ansible_facts' from source: unknown 25039 1726867466.49159: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868/AnsiballZ_package_facts.py 25039 1726867466.49390: Sending initial data 25039 1726867466.49395: Sent initial data (162 bytes) 25039 1726867466.50298: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867466.50304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867466.50332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867466.50335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867466.50338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867466.50341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867466.50387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867466.50400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867466.50455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867466.52006: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867466.52070: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867466.52110: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmplbqo8tot /root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868/AnsiballZ_package_facts.py <<< 25039 1726867466.52125: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868/AnsiballZ_package_facts.py" <<< 25039 1726867466.52181: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmplbqo8tot" to remote "/root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868/AnsiballZ_package_facts.py" <<< 25039 1726867466.53602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867466.53633: stderr chunk (state=3): >>><<< 25039 1726867466.53636: stdout chunk (state=3): >>><<< 25039 1726867466.53650: done transferring module to remote 25039 1726867466.53659: _low_level_execute_command(): starting 25039 1726867466.53662: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868/ /root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868/AnsiballZ_package_facts.py && sleep 0' 25039 1726867466.54046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867466.54054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867466.54071: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867466.54074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867466.54137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867466.54140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867466.54179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867466.55964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867466.55969: stdout chunk (state=3): >>><<< 25039 1726867466.55972: stderr chunk (state=3): >>><<< 25039 1726867466.56055: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867466.56058: _low_level_execute_command(): starting 25039 1726867466.56061: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868/AnsiballZ_package_facts.py && sleep 0' 25039 1726867466.56483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867466.56506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 1726867466.56510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867466.56562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867466.56565: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 25039 1726867466.56617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867467.00503: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", 
"version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 25039 1726867467.00524: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": 
"keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": 
"3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 25039 1726867467.00608: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": 
[{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": 
[{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": 
[{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": 
"grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": 
"1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 25039 1726867467.00665: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": 
"perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 25039 1726867467.00700: stdout chunk 
(state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 25039 1726867467.02364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867467.02395: stderr chunk (state=3): >>><<< 25039 1726867467.02398: stdout chunk (state=3): >>><<< 25039 1726867467.02435: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867467.03728: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867467.03744: _low_level_execute_command(): starting 25039 1726867467.03748: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867466.4563708-26132-212176108122868/ > /dev/null 2>&1 && sleep 0' 25039 1726867467.04136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867467.04164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867467.04167: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867467.04169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867467.04171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867467.04173: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867467.04233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867467.04235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867467.04287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867467.06282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867467.06285: stdout chunk (state=3): >>><<< 25039 1726867467.06287: stderr chunk (state=3): >>><<< 25039 1726867467.06290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867467.06292: handler run complete 25039 1726867467.07013: variable 'ansible_facts' from source: unknown 25039 1726867467.07523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.08584: variable 'ansible_facts' from source: unknown 25039 1726867467.08818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.09312: attempt loop complete, returning result 25039 1726867467.09315: _execute() done 25039 1726867467.09318: dumping result to json 25039 1726867467.09595: done dumping result, returning 25039 1726867467.09599: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-3ddc-7272-000000000519] 25039 1726867467.09601: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000519 25039 1726867467.11113: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000519 25039 1726867467.11117: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25039 1726867467.11199: no more pending results, returning what we have 25039 1726867467.11201: results queue empty 25039 1726867467.11202: checking for any_errors_fatal 25039 1726867467.11205: done checking for any_errors_fatal 25039 1726867467.11205: checking for max_fail_percentage 25039 1726867467.11206: done checking for max_fail_percentage 25039 1726867467.11206: checking to see if all hosts have failed and the running result is not ok 25039 1726867467.11207: done checking to see if all hosts have failed 25039 1726867467.11208: getting the remaining hosts for this loop 25039 1726867467.11209: done getting the remaining hosts for this loop 25039 1726867467.11211: getting 
the next task for host managed_node1 25039 1726867467.11216: done getting next task for host managed_node1 25039 1726867467.11219: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 25039 1726867467.11225: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25039 1726867467.11233: getting variables 25039 1726867467.11234: in VariableManager get_vars() 25039 1726867467.11259: Calling all_inventory to load vars for managed_node1 25039 1726867467.11261: Calling groups_inventory to load vars for managed_node1 25039 1726867467.11263: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867467.11269: Calling all_plugins_play to load vars for managed_node1 25039 1726867467.11271: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867467.11272: Calling groups_plugins_play to load vars for managed_node1 25039 1726867467.11944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.13332: done with get_vars() 25039 1726867467.13355: done getting variables 25039 1726867467.13418: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print 
network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:24:27 -0400 (0:00:00.728) 0:00:24.660 ****** 25039 1726867467.13458: entering _queue_task() for managed_node1/debug 25039 1726867467.13706: worker is 1 (out of 1 available) 25039 1726867467.13719: exiting _queue_task() for managed_node1/debug 25039 1726867467.13731: done queuing things up, now waiting for results queue to drain 25039 1726867467.13733: waiting for pending results... 25039 1726867467.13918: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 25039 1726867467.13998: in run() - task 0affcac9-a3a5-3ddc-7272-00000000006f 25039 1726867467.14008: variable 'ansible_search_path' from source: unknown 25039 1726867467.14014: variable 'ansible_search_path' from source: unknown 25039 1726867467.14043: calling self._execute() 25039 1726867467.14131: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.14135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.14145: variable 'omit' from source: magic vars 25039 1726867467.14421: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.14431: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867467.14437: variable 'omit' from source: magic vars 25039 1726867467.14474: variable 'omit' from source: magic vars 25039 1726867467.14548: variable 'network_provider' from source: set_fact 25039 1726867467.14562: variable 'omit' from source: magic vars 25039 1726867467.14597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867467.14626: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867467.14642: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867467.14654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867467.14665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867467.14690: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867467.14694: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.14696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.14766: Set connection var ansible_shell_executable to /bin/sh 25039 1726867467.14770: Set connection var ansible_timeout to 10 25039 1726867467.14776: Set connection var ansible_shell_type to sh 25039 1726867467.14781: Set connection var ansible_connection to ssh 25039 1726867467.14787: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867467.14792: Set connection var ansible_pipelining to False 25039 1726867467.14812: variable 'ansible_shell_executable' from source: unknown 25039 1726867467.14815: variable 'ansible_connection' from source: unknown 25039 1726867467.14818: variable 'ansible_module_compression' from source: unknown 25039 1726867467.14821: variable 'ansible_shell_type' from source: unknown 25039 1726867467.14826: variable 'ansible_shell_executable' from source: unknown 25039 1726867467.14829: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.14832: variable 'ansible_pipelining' from source: unknown 25039 1726867467.14834: variable 'ansible_timeout' from source: unknown 25039 1726867467.14836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.14930: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867467.14938: variable 'omit' from source: magic vars 25039 1726867467.14943: starting attempt loop 25039 1726867467.14948: running the handler 25039 1726867467.14985: handler run complete 25039 1726867467.14996: attempt loop complete, returning result 25039 1726867467.14999: _execute() done 25039 1726867467.15002: dumping result to json 25039 1726867467.15005: done dumping result, returning 25039 1726867467.15013: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-3ddc-7272-00000000006f] 25039 1726867467.15015: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000006f 25039 1726867467.15094: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000006f 25039 1726867467.15097: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 25039 1726867467.15186: no more pending results, returning what we have 25039 1726867467.15189: results queue empty 25039 1726867467.15190: checking for any_errors_fatal 25039 1726867467.15198: done checking for any_errors_fatal 25039 1726867467.15199: checking for max_fail_percentage 25039 1726867467.15200: done checking for max_fail_percentage 25039 1726867467.15201: checking to see if all hosts have failed and the running result is not ok 25039 1726867467.15202: done checking to see if all hosts have failed 25039 1726867467.15203: getting the remaining hosts for this loop 25039 1726867467.15204: done getting the remaining hosts for this loop 25039 1726867467.15207: getting the next task for host managed_node1 25039 1726867467.15216: done getting next task for host managed_node1 25039 1726867467.15267: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 25039 1726867467.15271: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25039 1726867467.15332: getting variables 25039 1726867467.15334: in VariableManager get_vars() 25039 1726867467.15384: Calling all_inventory to load vars for managed_node1 25039 1726867467.15387: Calling groups_inventory to load vars for managed_node1 25039 1726867467.15389: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867467.15395: Calling all_plugins_play to load vars for managed_node1 25039 1726867467.15397: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867467.15399: Calling groups_plugins_play to load vars for managed_node1 25039 1726867467.19949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.20868: done with get_vars() 25039 1726867467.20884: done getting variables 25039 1726867467.20919: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:24:27 -0400 (0:00:00.074) 0:00:24.735 ****** 25039 1726867467.20940: entering _queue_task() for managed_node1/fail 25039 1726867467.21186: worker is 1 (out of 1 available) 25039 1726867467.21199: exiting _queue_task() for managed_node1/fail 25039 1726867467.21209: done queuing things up, now waiting for results queue to drain 25039 1726867467.21211: waiting for pending results... 25039 1726867467.21393: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 25039 1726867467.21493: in run() - task 0affcac9-a3a5-3ddc-7272-000000000070 25039 1726867467.21503: variable 'ansible_search_path' from source: unknown 25039 1726867467.21507: variable 'ansible_search_path' from source: unknown 25039 1726867467.21537: calling self._execute() 25039 1726867467.21610: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.21618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.21627: variable 'omit' from source: magic vars 25039 1726867467.21908: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.21920: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867467.22004: variable 'network_state' from source: role '' defaults 25039 1726867467.22016: Evaluated conditional (network_state != {}): False 25039 1726867467.22019: when evaluation is False, skipping this task 25039 1726867467.22023: _execute() done 25039 1726867467.22026: dumping result to json 25039 1726867467.22029: done dumping result, returning 25039 1726867467.22035: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0affcac9-a3a5-3ddc-7272-000000000070] 25039 1726867467.22040: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000070 25039 1726867467.22122: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000070 25039 1726867467.22125: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25039 1726867467.22167: no more pending results, returning what we have 25039 1726867467.22170: results queue empty 25039 1726867467.22171: checking for any_errors_fatal 25039 1726867467.22180: done checking for any_errors_fatal 25039 1726867467.22180: checking for max_fail_percentage 25039 1726867467.22182: done checking for max_fail_percentage 25039 1726867467.22183: checking to see if all hosts have failed and the running result is not ok 25039 1726867467.22184: done checking to see if all hosts have failed 25039 1726867467.22185: getting the remaining hosts for this loop 25039 1726867467.22186: done getting the remaining hosts for this loop 25039 1726867467.22189: getting the next task for host managed_node1 25039 1726867467.22197: done getting next task for host managed_node1 25039 1726867467.22200: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25039 1726867467.22203: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 25039 1726867467.22221: getting variables 25039 1726867467.22223: in VariableManager get_vars() 25039 1726867467.22259: Calling all_inventory to load vars for managed_node1 25039 1726867467.22261: Calling groups_inventory to load vars for managed_node1 25039 1726867467.22263: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867467.22272: Calling all_plugins_play to load vars for managed_node1 25039 1726867467.22274: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867467.22276: Calling groups_plugins_play to load vars for managed_node1 25039 1726867467.23012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.23873: done with get_vars() 25039 1726867467.23889: done getting variables 25039 1726867467.23930: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:24:27 -0400 (0:00:00.030) 0:00:24.765 ****** 25039 1726867467.23952: entering _queue_task() for managed_node1/fail 25039 1726867467.24148: worker is 1 (out of 1 available) 25039 1726867467.24161: exiting _queue_task() for managed_node1/fail 25039 1726867467.24173: done queuing things up, now waiting for results queue to drain 25039 1726867467.24174: waiting for pending results... 
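
Annotation: the repeated `Evaluated conditional (network_state != {}): False` entries above are Ansible rendering each task's `when:` clause with the task's variables and testing the boolean result; because the role default for `network_state` is an empty dict, both abort tasks skip. A minimal standalone sketch of that evaluation using Jinja2 (the engine Ansible uses; the variable values here are illustrative, not taken from this run):

```python
from jinja2 import Environment

# Compile the same expression Ansible logs as the conditional.
# compile_expression returns a callable that evaluates the expression
# against keyword arguments, mirroring "Evaluated conditional (...)".
env = Environment()
network_state_check = env.compile_expression("network_state != {}")

# Role default: empty dict -> guard is False -> task is skipped
# (logged as: when evaluation is False, skipping this task).
print(network_state_check(network_state={}))

# A non-empty network_state would flip the guard and run the abort task.
print(network_state_check(network_state={"interfaces": []}))
```

This is why both "Abort applying the network state configuration..." tasks report `"false_condition": "network_state != {}"` in their skip results.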
25039 1726867467.24335: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25039 1726867467.24431: in run() - task 0affcac9-a3a5-3ddc-7272-000000000071 25039 1726867467.24441: variable 'ansible_search_path' from source: unknown 25039 1726867467.24445: variable 'ansible_search_path' from source: unknown 25039 1726867467.24474: calling self._execute() 25039 1726867467.24550: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.24554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.24563: variable 'omit' from source: magic vars 25039 1726867467.24828: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.24841: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867467.24920: variable 'network_state' from source: role '' defaults 25039 1726867467.24930: Evaluated conditional (network_state != {}): False 25039 1726867467.24933: when evaluation is False, skipping this task 25039 1726867467.24936: _execute() done 25039 1726867467.24940: dumping result to json 25039 1726867467.24943: done dumping result, returning 25039 1726867467.24947: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-3ddc-7272-000000000071] 25039 1726867467.24959: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000071 25039 1726867467.25032: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000071 25039 1726867467.25034: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25039 1726867467.25103: no more pending results, returning what we have 25039 
1726867467.25106: results queue empty 25039 1726867467.25107: checking for any_errors_fatal 25039 1726867467.25113: done checking for any_errors_fatal 25039 1726867467.25113: checking for max_fail_percentage 25039 1726867467.25115: done checking for max_fail_percentage 25039 1726867467.25115: checking to see if all hosts have failed and the running result is not ok 25039 1726867467.25116: done checking to see if all hosts have failed 25039 1726867467.25117: getting the remaining hosts for this loop 25039 1726867467.25118: done getting the remaining hosts for this loop 25039 1726867467.25121: getting the next task for host managed_node1 25039 1726867467.25126: done getting next task for host managed_node1 25039 1726867467.25128: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25039 1726867467.25131: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867467.25145: getting variables 25039 1726867467.25146: in VariableManager get_vars() 25039 1726867467.25179: Calling all_inventory to load vars for managed_node1 25039 1726867467.25182: Calling groups_inventory to load vars for managed_node1 25039 1726867467.25184: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867467.25192: Calling all_plugins_play to load vars for managed_node1 25039 1726867467.25194: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867467.25196: Calling groups_plugins_play to load vars for managed_node1 25039 1726867467.26001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.26865: done with get_vars() 25039 1726867467.26880: done getting variables 25039 1726867467.26920: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:24:27 -0400 (0:00:00.029) 0:00:24.795 ****** 25039 1726867467.26943: entering _queue_task() for managed_node1/fail 25039 1726867467.27124: worker is 1 (out of 1 available) 25039 1726867467.27137: exiting _queue_task() for managed_node1/fail 25039 1726867467.27147: done queuing things up, now waiting for results queue to drain 25039 1726867467.27149: waiting for pending results... 
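
Annotation: the teaming-abort task queued above is guarded first by version and distribution checks, evaluated in the entries that follow (`ansible_distribution_major_version | int > 9` and `ansible_distribution in __network_rh_distros`). The `| int` cast matters because distribution facts are gathered as strings. A hypothetical plain-Python paraphrase (the distro list contents here are illustrative; the role defines `__network_rh_distros` in its defaults):

```python
def el10_or_later(distribution, major_version, rh_distros):
    """Mirror the two chained guards from the log:
    ansible_distribution_major_version | int > 9
    and ansible_distribution in __network_rh_distros.
    major_version arrives as a string fact, hence the int() cast."""
    return int(major_version) > 9 and distribution in rh_distros

# Illustrative values only (not from this run):
print(el10_or_later("RedHat", "10", ["RedHat", "CentOS", "Fedora"]))  # EL10 -> guard passes
print(el10_or_later("RedHat", "9", ["RedHat", "CentOS", "Fedora"]))   # EL9  -> guard fails
```

Without the cast, `"10" > 9` would raise a type error in Python (and compare lexicographically in older Jinja2 setups), which is why the role's condition spells out the filter.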
25039 1726867467.27299: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25039 1726867467.27373: in run() - task 0affcac9-a3a5-3ddc-7272-000000000072 25039 1726867467.27387: variable 'ansible_search_path' from source: unknown 25039 1726867467.27390: variable 'ansible_search_path' from source: unknown 25039 1726867467.27413: calling self._execute() 25039 1726867467.27481: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.27494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.27498: variable 'omit' from source: magic vars 25039 1726867467.27743: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.27751: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867467.27863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867467.29311: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867467.29361: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867467.29389: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867467.29457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867467.29461: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867467.29489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.29513: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.29529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.29556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.29568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.29632: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.29642: Evaluated conditional (ansible_distribution_major_version | int > 9): True 25039 1726867467.29720: variable 'ansible_distribution' from source: facts 25039 1726867467.29724: variable '__network_rh_distros' from source: role '' defaults 25039 1726867467.29732: Evaluated conditional (ansible_distribution in __network_rh_distros): True 25039 1726867467.29886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.29904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.29922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 
1726867467.29950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.29960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.29996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.30015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.30031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.30055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.30066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.30096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.30119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 25039 1726867467.30133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.30156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.30167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.30349: variable 'network_connections' from source: task vars 25039 1726867467.30358: variable 'interface' from source: play vars 25039 1726867467.30404: variable 'interface' from source: play vars 25039 1726867467.30447: variable 'network_state' from source: role '' defaults 25039 1726867467.30462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867467.30582: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867467.30618: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867467.30640: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867467.30663: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867467.30695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867467.30710: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867467.30733: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.30751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867467.30769: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 25039 1726867467.30779: when evaluation is False, skipping this task 25039 1726867467.30782: _execute() done 25039 1726867467.30785: dumping result to json 25039 1726867467.30790: done dumping result, returning 25039 1726867467.30797: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-3ddc-7272-000000000072] 25039 1726867467.30800: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000072 25039 1726867467.30876: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000072 25039 1726867467.30880: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 25039 1726867467.30924: no more pending results, returning what we have 25039 
1726867467.30927: results queue empty 25039 1726867467.30928: checking for any_errors_fatal 25039 1726867467.30933: done checking for any_errors_fatal 25039 1726867467.30934: checking for max_fail_percentage 25039 1726867467.30935: done checking for max_fail_percentage 25039 1726867467.30936: checking to see if all hosts have failed and the running result is not ok 25039 1726867467.30937: done checking to see if all hosts have failed 25039 1726867467.30938: getting the remaining hosts for this loop 25039 1726867467.30939: done getting the remaining hosts for this loop 25039 1726867467.30943: getting the next task for host managed_node1 25039 1726867467.30949: done getting next task for host managed_node1 25039 1726867467.30952: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25039 1726867467.30954: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867467.30969: getting variables 25039 1726867467.30971: in VariableManager get_vars() 25039 1726867467.31006: Calling all_inventory to load vars for managed_node1 25039 1726867467.31011: Calling groups_inventory to load vars for managed_node1 25039 1726867467.31013: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867467.31021: Calling all_plugins_play to load vars for managed_node1 25039 1726867467.31024: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867467.31026: Calling groups_plugins_play to load vars for managed_node1 25039 1726867467.31755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.32721: done with get_vars() 25039 1726867467.32735: done getting variables 25039 1726867467.32772: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:24:27 -0400 (0:00:00.058) 0:00:24.853 ****** 25039 1726867467.32794: entering _queue_task() for managed_node1/dnf 25039 1726867467.32994: worker is 1 (out of 1 available) 25039 1726867467.33010: exiting _queue_task() for managed_node1/dnf 25039 1726867467.33022: done queuing things up, now waiting for results queue to drain 25039 1726867467.33023: waiting for pending results... 
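
Annotation: the teaming guard shown skipping above chains `selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0` over both `network_connections` and `network_state["interfaces"]`. Note that `match` is an Ansible-supplied Jinja2 test, not a Jinja2 builtin. A stdlib-only Python paraphrase of that test (function name and sample data are hypothetical, not from this run):

```python
import re

def has_team_interfaces(network_connections, network_state):
    """Paraphrase of the logged condition: true if any entry in either
    list has a defined "type" attribute matching ^team$."""
    def team_entries(items):
        # selectattr("type", "defined") -> keep dicts that have the key;
        # selectattr("type", "match", "^team$") -> regex-anchored match.
        return [i for i in items
                if "type" in i and re.match("^team$", i["type"])]
    return bool(team_entries(network_connections)
                or team_entries(network_state.get("interfaces", [])))

# One ethernet connection and an empty state, as in the skipped task above:
print(has_team_interfaces([{"name": "eth0", "type": "ethernet"}], {}))
```

With no team-typed entries on either side, the condition is False and the abort task skips, matching the `"skip_reason": "Conditional result was False"` result in the log.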
25039 1726867467.33183: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25039 1726867467.33252: in run() - task 0affcac9-a3a5-3ddc-7272-000000000073 25039 1726867467.33267: variable 'ansible_search_path' from source: unknown 25039 1726867467.33270: variable 'ansible_search_path' from source: unknown 25039 1726867467.33298: calling self._execute() 25039 1726867467.33374: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.33380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.33383: variable 'omit' from source: magic vars 25039 1726867467.33636: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.33645: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867467.33773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867467.35205: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867467.35255: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867467.35281: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867467.35307: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867467.35330: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867467.35383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.35402: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.35422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.35450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.35460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.35535: variable 'ansible_distribution' from source: facts 25039 1726867467.35539: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.35549: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 25039 1726867467.35621: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867467.35705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.35723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.35739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.35765: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.35776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.35805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.35821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.35837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.35861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.35874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.35918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.35947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 
1726867467.35951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.35979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.35991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.36085: variable 'network_connections' from source: task vars 25039 1726867467.36094: variable 'interface' from source: play vars 25039 1726867467.36139: variable 'interface' from source: play vars 25039 1726867467.36187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867467.36303: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867467.36333: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867467.36354: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867467.36374: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867467.36405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867467.36424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867467.36445: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.36463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867467.36500: variable '__network_team_connections_defined' from source: role '' defaults 25039 1726867467.36650: variable 'network_connections' from source: task vars 25039 1726867467.36654: variable 'interface' from source: play vars 25039 1726867467.36698: variable 'interface' from source: play vars 25039 1726867467.36718: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25039 1726867467.36722: when evaluation is False, skipping this task 25039 1726867467.36724: _execute() done 25039 1726867467.36727: dumping result to json 25039 1726867467.36729: done dumping result, returning 25039 1726867467.36735: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-3ddc-7272-000000000073] 25039 1726867467.36740: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000073 25039 1726867467.36821: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000073 25039 1726867467.36824: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25039 1726867467.36901: no more pending results, returning what we have 25039 1726867467.36904: results queue empty 25039 1726867467.36905: checking for any_errors_fatal 25039 1726867467.36911: done checking for any_errors_fatal 25039 1726867467.36912: 
checking for max_fail_percentage 25039 1726867467.36913: done checking for max_fail_percentage 25039 1726867467.36914: checking to see if all hosts have failed and the running result is not ok 25039 1726867467.36915: done checking to see if all hosts have failed 25039 1726867467.36916: getting the remaining hosts for this loop 25039 1726867467.36917: done getting the remaining hosts for this loop 25039 1726867467.36920: getting the next task for host managed_node1 25039 1726867467.36925: done getting next task for host managed_node1 25039 1726867467.36927: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25039 1726867467.36930: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867467.36946: getting variables 25039 1726867467.36947: in VariableManager get_vars() 25039 1726867467.36980: Calling all_inventory to load vars for managed_node1 25039 1726867467.36982: Calling groups_inventory to load vars for managed_node1 25039 1726867467.36984: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867467.36992: Calling all_plugins_play to load vars for managed_node1 25039 1726867467.36994: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867467.36997: Calling groups_plugins_play to load vars for managed_node1 25039 1726867467.37940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.39181: done with get_vars() 25039 1726867467.39196: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25039 1726867467.39248: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:24:27 -0400 (0:00:00.064) 0:00:24.918 ****** 25039 1726867467.39271: entering _queue_task() for managed_node1/yum 25039 1726867467.39473: worker is 1 (out of 1 available) 25039 1726867467.39489: exiting _queue_task() for managed_node1/yum 25039 1726867467.39501: done queuing things up, now waiting for results queue to drain 25039 1726867467.39503: waiting for pending results... 
25039 1726867467.39680: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25039 1726867467.39766: in run() - task 0affcac9-a3a5-3ddc-7272-000000000074 25039 1726867467.39778: variable 'ansible_search_path' from source: unknown 25039 1726867467.39782: variable 'ansible_search_path' from source: unknown 25039 1726867467.39811: calling self._execute() 25039 1726867467.39881: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.39886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.39894: variable 'omit' from source: magic vars 25039 1726867467.40155: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.40164: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867467.40278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867467.43227: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867467.43384: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867467.43387: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867467.43418: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867467.43461: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867467.43558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.43597: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.43630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.43665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.43679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.43749: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.43768: Evaluated conditional (ansible_distribution_major_version | int < 8): False 25039 1726867467.43771: when evaluation is False, skipping this task 25039 1726867467.43774: _execute() done 25039 1726867467.43776: dumping result to json 25039 1726867467.43780: done dumping result, returning 25039 1726867467.43783: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-3ddc-7272-000000000074] 25039 1726867467.43792: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000074 25039 1726867467.43876: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000074 25039 1726867467.43880: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 25039 1726867467.43928: no more pending results, returning 
what we have 25039 1726867467.43931: results queue empty 25039 1726867467.43932: checking for any_errors_fatal 25039 1726867467.43940: done checking for any_errors_fatal 25039 1726867467.43940: checking for max_fail_percentage 25039 1726867467.43942: done checking for max_fail_percentage 25039 1726867467.43943: checking to see if all hosts have failed and the running result is not ok 25039 1726867467.43944: done checking to see if all hosts have failed 25039 1726867467.43945: getting the remaining hosts for this loop 25039 1726867467.43946: done getting the remaining hosts for this loop 25039 1726867467.43949: getting the next task for host managed_node1 25039 1726867467.43956: done getting next task for host managed_node1 25039 1726867467.43960: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25039 1726867467.43962: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867467.43985: getting variables 25039 1726867467.43987: in VariableManager get_vars() 25039 1726867467.44027: Calling all_inventory to load vars for managed_node1 25039 1726867467.44030: Calling groups_inventory to load vars for managed_node1 25039 1726867467.44032: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867467.44040: Calling all_plugins_play to load vars for managed_node1 25039 1726867467.44043: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867467.44045: Calling groups_plugins_play to load vars for managed_node1 25039 1726867467.45004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.47095: done with get_vars() 25039 1726867467.47122: done getting variables 25039 1726867467.47180: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:24:27 -0400 (0:00:00.079) 0:00:24.997 ****** 25039 1726867467.47214: entering _queue_task() for managed_node1/fail 25039 1726867467.47516: worker is 1 (out of 1 available) 25039 1726867467.47530: exiting _queue_task() for managed_node1/fail 25039 1726867467.47542: done queuing things up, now waiting for results queue to drain 25039 1726867467.47544: waiting for pending results... 
25039 1726867467.47904: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25039 1726867467.47954: in run() - task 0affcac9-a3a5-3ddc-7272-000000000075 25039 1726867467.47973: variable 'ansible_search_path' from source: unknown 25039 1726867467.47983: variable 'ansible_search_path' from source: unknown 25039 1726867467.48024: calling self._execute() 25039 1726867467.48129: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.48140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.48182: variable 'omit' from source: magic vars 25039 1726867467.48532: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.48552: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867467.48682: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867467.48879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867467.52184: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867467.52436: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867467.52440: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867467.52443: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867467.52484: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867467.52762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 25039 1726867467.52766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.52769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.52843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.52867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.52964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.53004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.53035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.53079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.53383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.53387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.53404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.53576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.53651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.53670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.53855: variable 'network_connections' from source: task vars 25039 1726867467.53876: variable 'interface' from source: play vars 25039 1726867467.53948: variable 'interface' from source: play vars 25039 1726867467.54038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867467.54222: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867467.54272: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867467.54316: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867467.54348: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867467.54404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867467.54426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867467.54514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.54518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867467.54549: variable '__network_team_connections_defined' from source: role '' defaults 25039 1726867467.54815: variable 'network_connections' from source: task vars 25039 1726867467.54825: variable 'interface' from source: play vars 25039 1726867467.54897: variable 'interface' from source: play vars 25039 1726867467.54925: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25039 1726867467.54933: when evaluation is False, skipping this task 25039 1726867467.54940: _execute() done 25039 1726867467.54954: dumping result to json 25039 1726867467.54982: done dumping result, returning 25039 1726867467.54985: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-3ddc-7272-000000000075] 25039 1726867467.54988: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000075 skipping: [managed_node1] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25039 1726867467.55260: no more pending results, returning what we have 25039 1726867467.55264: results queue empty 25039 1726867467.55265: checking for any_errors_fatal 25039 1726867467.55274: done checking for any_errors_fatal 25039 1726867467.55276: checking for max_fail_percentage 25039 1726867467.55280: done checking for max_fail_percentage 25039 1726867467.55281: checking to see if all hosts have failed and the running result is not ok 25039 1726867467.55282: done checking to see if all hosts have failed 25039 1726867467.55283: getting the remaining hosts for this loop 25039 1726867467.55284: done getting the remaining hosts for this loop 25039 1726867467.55288: getting the next task for host managed_node1 25039 1726867467.55296: done getting next task for host managed_node1 25039 1726867467.55299: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 25039 1726867467.55302: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867467.55321: getting variables 25039 1726867467.55323: in VariableManager get_vars() 25039 1726867467.55362: Calling all_inventory to load vars for managed_node1 25039 1726867467.55365: Calling groups_inventory to load vars for managed_node1 25039 1726867467.55368: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867467.55433: Calling all_plugins_play to load vars for managed_node1 25039 1726867467.55440: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867467.55499: Calling groups_plugins_play to load vars for managed_node1 25039 1726867467.56240: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000075 25039 1726867467.56243: WORKER PROCESS EXITING 25039 1726867467.58027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.60388: done with get_vars() 25039 1726867467.60412: done getting variables 25039 1726867467.60548: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:24:27 -0400 (0:00:00.133) 0:00:25.132 ****** 25039 1726867467.60675: entering _queue_task() for managed_node1/package 25039 1726867467.61111: worker is 1 (out of 1 available) 25039 1726867467.61125: exiting _queue_task() for managed_node1/package 25039 1726867467.61143: done queuing things up, now waiting for results queue to drain 25039 1726867467.61145: waiting for pending results... 
25039 1726867467.61341: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 25039 1726867467.61466: in run() - task 0affcac9-a3a5-3ddc-7272-000000000076 25039 1726867467.61487: variable 'ansible_search_path' from source: unknown 25039 1726867467.61494: variable 'ansible_search_path' from source: unknown 25039 1726867467.61532: calling self._execute() 25039 1726867467.61629: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.61640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.61654: variable 'omit' from source: magic vars 25039 1726867467.62011: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.62029: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867467.62215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867467.62469: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867467.62519: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867467.62554: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867467.62623: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867467.62733: variable 'network_packages' from source: role '' defaults 25039 1726867467.62837: variable '__network_provider_setup' from source: role '' defaults 25039 1726867467.62857: variable '__network_service_name_default_nm' from source: role '' defaults 25039 1726867467.62923: variable '__network_service_name_default_nm' from source: role '' defaults 25039 1726867467.62940: variable '__network_packages_default_nm' from source: role '' defaults 25039 1726867467.63003: variable 
'__network_packages_default_nm' from source: role '' defaults 25039 1726867467.63164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867467.65348: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867467.65417: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867467.65683: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867467.65686: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867467.65688: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867467.65690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.65692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.65694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.65696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.65697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 
1726867467.65738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.65763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.65791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.65830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.65846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.66063: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25039 1726867467.66176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.66222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.66252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.66298: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.66318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.66412: variable 'ansible_python' from source: facts 25039 1726867467.66446: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25039 1726867467.66540: variable '__network_wpa_supplicant_required' from source: role '' defaults 25039 1726867467.66622: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25039 1726867467.66754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.66788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.66820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.66865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.66887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.66944: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.66986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.67022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.67066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.67092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.67238: variable 'network_connections' from source: task vars 25039 1726867467.67249: variable 'interface' from source: play vars 25039 1726867467.67353: variable 'interface' from source: play vars 25039 1726867467.67429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867467.67464: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867467.67502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.67538: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867467.67596: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867467.67857: variable 'network_connections' from source: task vars 25039 1726867467.67886: variable 'interface' from source: play vars 25039 1726867467.67971: variable 'interface' from source: play vars 25039 1726867467.68084: variable '__network_packages_default_wireless' from source: role '' defaults 25039 1726867467.68096: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867467.68382: variable 'network_connections' from source: task vars 25039 1726867467.68393: variable 'interface' from source: play vars 25039 1726867467.68462: variable 'interface' from source: play vars 25039 1726867467.68491: variable '__network_packages_default_team' from source: role '' defaults 25039 1726867467.68571: variable '__network_team_connections_defined' from source: role '' defaults 25039 1726867467.68975: variable 'network_connections' from source: task vars 25039 1726867467.68981: variable 'interface' from source: play vars 25039 1726867467.68983: variable 'interface' from source: play vars 25039 1726867467.69020: variable '__network_service_name_default_initscripts' from source: role '' defaults 25039 1726867467.69086: variable '__network_service_name_default_initscripts' from source: role '' defaults 25039 1726867467.69099: variable '__network_packages_default_initscripts' from source: role '' defaults 25039 1726867467.69158: variable '__network_packages_default_initscripts' from source: role '' defaults 25039 1726867467.69373: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25039 1726867467.69858: variable 'network_connections' from source: task vars 25039 1726867467.69869: variable 'interface' from source: play vars 
25039 1726867467.69931: variable 'interface' from source: play vars 25039 1726867467.69944: variable 'ansible_distribution' from source: facts 25039 1726867467.69956: variable '__network_rh_distros' from source: role '' defaults 25039 1726867467.69966: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.69987: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25039 1726867467.70157: variable 'ansible_distribution' from source: facts 25039 1726867467.70170: variable '__network_rh_distros' from source: role '' defaults 25039 1726867467.70182: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.70283: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25039 1726867467.70358: variable 'ansible_distribution' from source: facts 25039 1726867467.70367: variable '__network_rh_distros' from source: role '' defaults 25039 1726867467.70376: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.70421: variable 'network_provider' from source: set_fact 25039 1726867467.70440: variable 'ansible_facts' from source: unknown 25039 1726867467.71126: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 25039 1726867467.71134: when evaluation is False, skipping this task 25039 1726867467.71142: _execute() done 25039 1726867467.71150: dumping result to json 25039 1726867467.71163: done dumping result, returning 25039 1726867467.71176: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-3ddc-7272-000000000076] 25039 1726867467.71188: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000076 25039 1726867467.71341: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000076 25039 1726867467.71344: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not 
network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 25039 1726867467.71425: no more pending results, returning what we have 25039 1726867467.71429: results queue empty 25039 1726867467.71430: checking for any_errors_fatal 25039 1726867467.71437: done checking for any_errors_fatal 25039 1726867467.71438: checking for max_fail_percentage 25039 1726867467.71440: done checking for max_fail_percentage 25039 1726867467.71441: checking to see if all hosts have failed and the running result is not ok 25039 1726867467.71442: done checking to see if all hosts have failed 25039 1726867467.71443: getting the remaining hosts for this loop 25039 1726867467.71445: done getting the remaining hosts for this loop 25039 1726867467.71449: getting the next task for host managed_node1 25039 1726867467.71457: done getting next task for host managed_node1 25039 1726867467.71460: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25039 1726867467.71463: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867467.71484: getting variables 25039 1726867467.71486: in VariableManager get_vars() 25039 1726867467.71526: Calling all_inventory to load vars for managed_node1 25039 1726867467.71529: Calling groups_inventory to load vars for managed_node1 25039 1726867467.71532: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867467.71543: Calling all_plugins_play to load vars for managed_node1 25039 1726867467.71546: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867467.71550: Calling groups_plugins_play to load vars for managed_node1 25039 1726867467.73260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.74827: done with get_vars() 25039 1726867467.74850: done getting variables 25039 1726867467.74923: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:24:27 -0400 (0:00:00.142) 0:00:25.275 ****** 25039 1726867467.74960: entering _queue_task() for managed_node1/package 25039 1726867467.75315: worker is 1 (out of 1 available) 25039 1726867467.75327: exiting _queue_task() for managed_node1/package 25039 1726867467.75339: done queuing things up, now waiting for results queue to drain 25039 1726867467.75341: waiting for pending results... 
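The next task is skipped whenever `network_state` is left at its empty-dict role default. A hedged sketch of what such a task plausibly looks like (the real task at roles/network/tasks/main.yml:85 may name its packages via role variables rather than literals):

```yaml
# Hedged sketch: install the nmstate stack only when the caller passed
# a non-empty `network_state` dictionary; the default {} makes the
# `when` evaluate False, producing the skip recorded in the log.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
```

The "Install python3-libnmstate when using network_state variable" task at main.yml:96, which runs next in the log, is gated by the same `network_state != {}` condition and skips for the same reason.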
25039 1726867467.75707: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25039 1726867467.75776: in run() - task 0affcac9-a3a5-3ddc-7272-000000000077 25039 1726867467.75806: variable 'ansible_search_path' from source: unknown 25039 1726867467.75915: variable 'ansible_search_path' from source: unknown 25039 1726867467.75918: calling self._execute() 25039 1726867467.75957: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.75970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.75987: variable 'omit' from source: magic vars 25039 1726867467.76387: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.76406: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867467.76545: variable 'network_state' from source: role '' defaults 25039 1726867467.76568: Evaluated conditional (network_state != {}): False 25039 1726867467.76579: when evaluation is False, skipping this task 25039 1726867467.76587: _execute() done 25039 1726867467.76594: dumping result to json 25039 1726867467.76601: done dumping result, returning 25039 1726867467.76615: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-3ddc-7272-000000000077] 25039 1726867467.76625: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000077 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25039 1726867467.76888: no more pending results, returning what we have 25039 1726867467.76893: results queue empty 25039 1726867467.76894: checking for any_errors_fatal 25039 1726867467.76902: done checking for any_errors_fatal 25039 1726867467.76903: checking for max_fail_percentage 25039 
1726867467.76905: done checking for max_fail_percentage 25039 1726867467.76906: checking to see if all hosts have failed and the running result is not ok 25039 1726867467.76910: done checking to see if all hosts have failed 25039 1726867467.76910: getting the remaining hosts for this loop 25039 1726867467.76912: done getting the remaining hosts for this loop 25039 1726867467.76919: getting the next task for host managed_node1 25039 1726867467.76927: done getting next task for host managed_node1 25039 1726867467.76933: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25039 1726867467.76936: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867467.76957: getting variables 25039 1726867467.76959: in VariableManager get_vars() 25039 1726867467.77001: Calling all_inventory to load vars for managed_node1 25039 1726867467.77004: Calling groups_inventory to load vars for managed_node1 25039 1726867467.77006: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867467.77019: Calling all_plugins_play to load vars for managed_node1 25039 1726867467.77022: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867467.77025: Calling groups_plugins_play to load vars for managed_node1 25039 1726867467.77594: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000077 25039 1726867467.77597: WORKER PROCESS EXITING 25039 1726867467.77843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.78966: done with get_vars() 25039 1726867467.78988: done getting variables 25039 1726867467.79203: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:24:27 -0400 (0:00:00.042) 0:00:25.317 ****** 25039 1726867467.79235: entering _queue_task() for managed_node1/package 25039 1726867467.79646: worker is 1 (out of 1 available) 25039 1726867467.79659: exiting _queue_task() for managed_node1/package 25039 1726867467.79671: done queuing things up, now waiting for results queue to drain 25039 1726867467.79673: waiting for pending results... 
25039 1726867467.79853: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25039 1726867467.79937: in run() - task 0affcac9-a3a5-3ddc-7272-000000000078 25039 1726867467.79949: variable 'ansible_search_path' from source: unknown 25039 1726867467.79952: variable 'ansible_search_path' from source: unknown 25039 1726867467.79980: calling self._execute() 25039 1726867467.80050: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.80054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.80063: variable 'omit' from source: magic vars 25039 1726867467.80485: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.80488: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867467.80491: variable 'network_state' from source: role '' defaults 25039 1726867467.80493: Evaluated conditional (network_state != {}): False 25039 1726867467.80495: when evaluation is False, skipping this task 25039 1726867467.80498: _execute() done 25039 1726867467.80500: dumping result to json 25039 1726867467.80502: done dumping result, returning 25039 1726867467.80504: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-3ddc-7272-000000000078] 25039 1726867467.80507: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000078 25039 1726867467.80568: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000078 25039 1726867467.80571: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25039 1726867467.80634: no more pending results, returning what we have 25039 1726867467.80638: results queue empty 25039 1726867467.80638: checking for 
any_errors_fatal 25039 1726867467.80643: done checking for any_errors_fatal 25039 1726867467.80644: checking for max_fail_percentage 25039 1726867467.80645: done checking for max_fail_percentage 25039 1726867467.80646: checking to see if all hosts have failed and the running result is not ok 25039 1726867467.80646: done checking to see if all hosts have failed 25039 1726867467.80647: getting the remaining hosts for this loop 25039 1726867467.80648: done getting the remaining hosts for this loop 25039 1726867467.80651: getting the next task for host managed_node1 25039 1726867467.80656: done getting next task for host managed_node1 25039 1726867467.80659: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25039 1726867467.80662: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867467.80676: getting variables 25039 1726867467.80717: in VariableManager get_vars() 25039 1726867467.80747: Calling all_inventory to load vars for managed_node1 25039 1726867467.80749: Calling groups_inventory to load vars for managed_node1 25039 1726867467.80751: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867467.80759: Calling all_plugins_play to load vars for managed_node1 25039 1726867467.80761: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867467.80764: Calling groups_plugins_play to load vars for managed_node1 25039 1726867467.81866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.83329: done with get_vars() 25039 1726867467.83349: done getting variables 25039 1726867467.83408: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:24:27 -0400 (0:00:00.042) 0:00:25.360 ****** 25039 1726867467.83440: entering _queue_task() for managed_node1/service 25039 1726867467.83688: worker is 1 (out of 1 available) 25039 1726867467.83699: exiting _queue_task() for managed_node1/service 25039 1726867467.83711: done queuing things up, now waiting for results queue to drain 25039 1726867467.83712: waiting for pending results... 
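The restart task that follows uses the `service` action plugin (loaded above) and is guarded by the two role-default flags shown in its skip result. A minimal sketch consistent with that condition; the exact handler logic at roles/network/tasks/main.yml:109 may differ:

```yaml
# Hedged sketch: wireless and team profiles can require a NetworkManager
# restart to take effect, so the restart only fires when either class of
# connection is defined in `network_connections`.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```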
25039 1726867467.83987: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25039 1726867467.84128: in run() - task 0affcac9-a3a5-3ddc-7272-000000000079 25039 1726867467.84149: variable 'ansible_search_path' from source: unknown 25039 1726867467.84158: variable 'ansible_search_path' from source: unknown 25039 1726867467.84200: calling self._execute() 25039 1726867467.84321: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.84325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.84328: variable 'omit' from source: magic vars 25039 1726867467.84690: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.84707: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867467.84866: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867467.85028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867467.87166: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867467.87243: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867467.87292: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867467.87583: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867467.87586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867467.87588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 25039 1726867467.87591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.87592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.87594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.87596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.87605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.87632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.87658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.87698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.87720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.87760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.87787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.87821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.87865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.87887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.88058: variable 'network_connections' from source: task vars 25039 1726867467.88072: variable 'interface' from source: play vars 25039 1726867467.88133: variable 'interface' from source: play vars 25039 1726867467.88206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867467.88373: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867467.88425: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867467.88460: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867467.88498: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867467.88537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867467.88558: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867467.88591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.88622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867467.88670: variable '__network_team_connections_defined' from source: role '' defaults 25039 1726867467.88906: variable 'network_connections' from source: task vars 25039 1726867467.88916: variable 'interface' from source: play vars 25039 1726867467.88971: variable 'interface' from source: play vars 25039 1726867467.88998: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25039 1726867467.89010: when evaluation is False, skipping this task 25039 1726867467.89016: _execute() done 25039 1726867467.89023: dumping result to json 25039 1726867467.89030: done dumping result, returning 25039 1726867467.89041: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-3ddc-7272-000000000079] 25039 1726867467.89050: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000079 skipping: [managed_node1] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25039 1726867467.89441: no more pending results, returning what we have 25039 1726867467.89445: results queue empty 25039 1726867467.89445: checking for any_errors_fatal 25039 1726867467.89451: done checking for any_errors_fatal 25039 1726867467.89452: checking for max_fail_percentage 25039 1726867467.89453: done checking for max_fail_percentage 25039 1726867467.89454: checking to see if all hosts have failed and the running result is not ok 25039 1726867467.89455: done checking to see if all hosts have failed 25039 1726867467.89456: getting the remaining hosts for this loop 25039 1726867467.89458: done getting the remaining hosts for this loop 25039 1726867467.89461: getting the next task for host managed_node1 25039 1726867467.89468: done getting next task for host managed_node1 25039 1726867467.89471: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25039 1726867467.89474: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867467.89493: getting variables 25039 1726867467.89495: in VariableManager get_vars() 25039 1726867467.89533: Calling all_inventory to load vars for managed_node1 25039 1726867467.89535: Calling groups_inventory to load vars for managed_node1 25039 1726867467.89537: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867467.89547: Calling all_plugins_play to load vars for managed_node1 25039 1726867467.89550: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867467.89552: Calling groups_plugins_play to load vars for managed_node1 25039 1726867467.90283: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000079 25039 1726867467.90286: WORKER PROCESS EXITING 25039 1726867467.91004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867467.91949: done with get_vars() 25039 1726867467.91963: done getting variables 25039 1726867467.92004: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:24:27 -0400 (0:00:00.085) 0:00:25.445 ****** 25039 1726867467.92028: entering _queue_task() for managed_node1/service 25039 1726867467.92240: worker is 1 (out of 1 available) 25039 1726867467.92253: exiting _queue_task() for managed_node1/service 25039 1726867467.92265: done queuing things up, now waiting for results queue to drain 25039 1726867467.92267: waiting for pending results... 
25039 1726867467.92465: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25039 1726867467.92600: in run() - task 0affcac9-a3a5-3ddc-7272-00000000007a 25039 1726867467.92623: variable 'ansible_search_path' from source: unknown 25039 1726867467.92638: variable 'ansible_search_path' from source: unknown 25039 1726867467.92676: calling self._execute() 25039 1726867467.92788: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.92805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.92819: variable 'omit' from source: magic vars 25039 1726867467.93190: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.93205: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867467.93382: variable 'network_provider' from source: set_fact 25039 1726867467.93385: variable 'network_state' from source: role '' defaults 25039 1726867467.93395: Evaluated conditional (network_provider == "nm" or network_state != {}): True 25039 1726867467.93401: variable 'omit' from source: magic vars 25039 1726867467.93438: variable 'omit' from source: magic vars 25039 1726867467.93465: variable 'network_service_name' from source: role '' defaults 25039 1726867467.93522: variable 'network_service_name' from source: role '' defaults 25039 1726867467.93595: variable '__network_provider_setup' from source: role '' defaults 25039 1726867467.93598: variable '__network_service_name_default_nm' from source: role '' defaults 25039 1726867467.93642: variable '__network_service_name_default_nm' from source: role '' defaults 25039 1726867467.93650: variable '__network_packages_default_nm' from source: role '' defaults 25039 1726867467.93696: variable '__network_packages_default_nm' from source: role '' defaults 25039 1726867467.93832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 25039 1726867467.95220: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867467.95285: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867467.95312: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867467.95339: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867467.95417: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867467.95459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.95472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.95695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.95698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.95701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.95703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 25039 1726867467.95706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.95710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.95713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.95715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.95896: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25039 1726867467.96001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.96025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.96048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.96088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.96103: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.96187: variable 'ansible_python' from source: facts 25039 1726867467.96206: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25039 1726867467.96279: variable '__network_wpa_supplicant_required' from source: role '' defaults 25039 1726867467.96349: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25039 1726867467.96462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.96485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.96512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.96546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.96560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.96606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867467.96629: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867467.96655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.96691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867467.96705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867467.96830: variable 'network_connections' from source: task vars 25039 1726867467.96838: variable 'interface' from source: play vars 25039 1726867467.96915: variable 'interface' from source: play vars 25039 1726867467.97002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867467.97149: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867467.97184: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867467.97215: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867467.97253: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867467.97290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867467.97310: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867467.97334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867467.97358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867467.97395: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867467.97574: variable 'network_connections' from source: task vars 25039 1726867467.97584: variable 'interface' from source: play vars 25039 1726867467.97636: variable 'interface' from source: play vars 25039 1726867467.97659: variable '__network_packages_default_wireless' from source: role '' defaults 25039 1726867467.97718: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867467.97897: variable 'network_connections' from source: task vars 25039 1726867467.97900: variable 'interface' from source: play vars 25039 1726867467.97953: variable 'interface' from source: play vars 25039 1726867467.97969: variable '__network_packages_default_team' from source: role '' defaults 25039 1726867467.98027: variable '__network_team_connections_defined' from source: role '' defaults 25039 1726867467.98205: variable 'network_connections' from source: task vars 25039 1726867467.98208: variable 'interface' from source: play vars 25039 1726867467.98260: variable 'interface' from source: play vars 25039 1726867467.98297: variable '__network_service_name_default_initscripts' from source: role '' defaults 25039 1726867467.98342: variable '__network_service_name_default_initscripts' from source: role '' defaults 25039 1726867467.98345: 
variable '__network_packages_default_initscripts' from source: role '' defaults 25039 1726867467.98388: variable '__network_packages_default_initscripts' from source: role '' defaults 25039 1726867467.98536: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25039 1726867467.99096: variable 'network_connections' from source: task vars 25039 1726867467.99099: variable 'interface' from source: play vars 25039 1726867467.99102: variable 'interface' from source: play vars 25039 1726867467.99104: variable 'ansible_distribution' from source: facts 25039 1726867467.99106: variable '__network_rh_distros' from source: role '' defaults 25039 1726867467.99108: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.99110: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25039 1726867467.99281: variable 'ansible_distribution' from source: facts 25039 1726867467.99285: variable '__network_rh_distros' from source: role '' defaults 25039 1726867467.99290: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.99327: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25039 1726867467.99460: variable 'ansible_distribution' from source: facts 25039 1726867467.99469: variable '__network_rh_distros' from source: role '' defaults 25039 1726867467.99481: variable 'ansible_distribution_major_version' from source: facts 25039 1726867467.99584: variable 'network_provider' from source: set_fact 25039 1726867467.99587: variable 'omit' from source: magic vars 25039 1726867467.99589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867467.99612: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867467.99638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 
1726867467.99660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867467.99676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867467.99736: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867467.99746: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867467.99755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867467.99855: Set connection var ansible_shell_executable to /bin/sh 25039 1726867467.99868: Set connection var ansible_timeout to 10 25039 1726867467.99882: Set connection var ansible_shell_type to sh 25039 1726867467.99986: Set connection var ansible_connection to ssh 25039 1726867467.99989: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867467.99991: Set connection var ansible_pipelining to False 25039 1726867467.99994: variable 'ansible_shell_executable' from source: unknown 25039 1726867467.99996: variable 'ansible_connection' from source: unknown 25039 1726867467.99997: variable 'ansible_module_compression' from source: unknown 25039 1726867467.99999: variable 'ansible_shell_type' from source: unknown 25039 1726867468.00001: variable 'ansible_shell_executable' from source: unknown 25039 1726867468.00003: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867468.00004: variable 'ansible_pipelining' from source: unknown 25039 1726867468.00006: variable 'ansible_timeout' from source: unknown 25039 1726867468.00010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867468.00073: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867468.00093: variable 'omit' from source: magic vars 25039 1726867468.00107: starting attempt loop 25039 1726867468.00117: running the handler 25039 1726867468.00196: variable 'ansible_facts' from source: unknown 25039 1726867468.00729: _low_level_execute_command(): starting 25039 1726867468.00735: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867468.01181: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867468.01185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867468.01188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867468.01190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867468.01242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867468.01246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867468.01250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 25039 1726867468.01303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867468.02982: stdout chunk (state=3): >>>/root <<< 25039 1726867468.03082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867468.03112: stderr chunk (state=3): >>><<< 25039 1726867468.03114: stdout chunk (state=3): >>><<< 25039 1726867468.03182: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867468.03186: _low_level_execute_command(): starting 25039 1726867468.03189: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060 `" && echo ansible-tmp-1726867468.0312636-26228-186181623779060="` 
echo /root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060 `" ) && sleep 0' 25039 1726867468.03503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867468.03521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867468.03537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867468.03571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867468.03587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867468.03644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867468.05497: stdout chunk (state=3): >>>ansible-tmp-1726867468.0312636-26228-186181623779060=/root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060 <<< 25039 1726867468.05633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867468.05636: stdout chunk (state=3): >>><<< 25039 1726867468.05642: stderr chunk (state=3): >>><<< 25039 1726867468.05653: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867468.0312636-26228-186181623779060=/root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867468.05674: variable 'ansible_module_compression' from source: unknown 25039 1726867468.05717: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 25039 1726867468.05760: variable 'ansible_facts' from source: unknown 25039 1726867468.05892: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060/AnsiballZ_systemd.py 25039 1726867468.05983: Sending initial data 25039 1726867468.05986: Sent initial data (156 bytes) 25039 1726867468.06397: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867468.06400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867468.06406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867468.06408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867468.06411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867468.06450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867468.06454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867468.06512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867468.08034: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: 
Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867468.08073: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867468.08119: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpmsph1q50 /root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060/AnsiballZ_systemd.py <<< 25039 1726867468.08122: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060/AnsiballZ_systemd.py" <<< 25039 1726867468.08166: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpmsph1q50" to remote "/root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060/AnsiballZ_systemd.py" <<< 25039 1726867468.09225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867468.09257: stderr chunk (state=3): >>><<< 25039 1726867468.09260: stdout chunk (state=3): >>><<< 25039 1726867468.09302: done transferring module to remote 25039 1726867468.09312: _low_level_execute_command(): starting 25039 1726867468.09315: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060/ /root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060/AnsiballZ_systemd.py && sleep 0' 25039 1726867468.09721: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867468.09724: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867468.09727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25039 1726867468.09729: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867468.09730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867468.09732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867468.09782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867468.09789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867468.09832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867468.11540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867468.11561: stderr chunk (state=3): >>><<< 25039 1726867468.11564: stdout chunk (state=3): >>><<< 25039 1726867468.11575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867468.11580: _low_level_execute_command(): starting 25039 1726867468.11585: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060/AnsiballZ_systemd.py && sleep 0' 25039 1726867468.11993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867468.11996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867468.11998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867468.12000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867468.12002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867468.12050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867468.12056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867468.12104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867468.41140: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": 
"14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10866688", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3302055936", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1577998000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": 
"[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 25039 1726867468.41146: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", 
"NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 25039 1726867468.43240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867468.43244: stdout chunk (state=3): >>><<< 25039 1726867468.43247: stderr chunk (state=3): >>><<< 25039 1726867468.43285: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10866688", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3302055936", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1577998000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867468.43668: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867468.43783: _low_level_execute_command(): starting 25039 1726867468.43787: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867468.0312636-26228-186181623779060/ > /dev/null 2>&1 && sleep 0' 25039 1726867468.44975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867468.45230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867468.45332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867468.47104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867468.47145: stderr chunk (state=3): >>><<< 25039 1726867468.47383: stdout chunk (state=3): >>><<< 25039 1726867468.47386: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867468.47388: handler run complete 25039 1726867468.47390: attempt loop complete, returning result 25039 1726867468.47392: _execute() done 25039 
1726867468.47393: dumping result to json 25039 1726867468.47395: done dumping result, returning 25039 1726867468.47397: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-3ddc-7272-00000000007a] 25039 1726867468.47399: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000007a 25039 1726867468.47836: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000007a 25039 1726867468.47841: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25039 1726867468.47901: no more pending results, returning what we have 25039 1726867468.47905: results queue empty 25039 1726867468.47906: checking for any_errors_fatal 25039 1726867468.47913: done checking for any_errors_fatal 25039 1726867468.47914: checking for max_fail_percentage 25039 1726867468.47916: done checking for max_fail_percentage 25039 1726867468.47917: checking to see if all hosts have failed and the running result is not ok 25039 1726867468.47918: done checking to see if all hosts have failed 25039 1726867468.47919: getting the remaining hosts for this loop 25039 1726867468.47921: done getting the remaining hosts for this loop 25039 1726867468.47924: getting the next task for host managed_node1 25039 1726867468.47933: done getting next task for host managed_node1 25039 1726867468.47936: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25039 1726867468.47939: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25039 1726867468.47952: getting variables 25039 1726867468.47954: in VariableManager get_vars() 25039 1726867468.47994: Calling all_inventory to load vars for managed_node1 25039 1726867468.47997: Calling groups_inventory to load vars for managed_node1 25039 1726867468.48000: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867468.48011: Calling all_plugins_play to load vars for managed_node1 25039 1726867468.48015: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867468.48019: Calling groups_plugins_play to load vars for managed_node1 25039 1726867468.51207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867468.54551: done with get_vars() 25039 1726867468.54579: done getting variables 25039 1726867468.54642: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:24:28 -0400 (0:00:00.627) 0:00:26.073 ****** 25039 1726867468.54744: entering _queue_task() for managed_node1/service 25039 1726867468.55584: worker is 1 (out of 1 available) 25039 1726867468.55598: exiting _queue_task() for managed_node1/service 25039 1726867468.55612: done queuing things up, now 
waiting for results queue to drain 25039 1726867468.55614: waiting for pending results... 25039 1726867468.56395: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25039 1726867468.56430: in run() - task 0affcac9-a3a5-3ddc-7272-00000000007b 25039 1726867468.56452: variable 'ansible_search_path' from source: unknown 25039 1726867468.56462: variable 'ansible_search_path' from source: unknown 25039 1726867468.56507: calling self._execute() 25039 1726867468.56984: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867468.56988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867468.56990: variable 'omit' from source: magic vars 25039 1726867468.57783: variable 'ansible_distribution_major_version' from source: facts 25039 1726867468.57787: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867468.57789: variable 'network_provider' from source: set_fact 25039 1726867468.57792: Evaluated conditional (network_provider == "nm"): True 25039 1726867468.57917: variable '__network_wpa_supplicant_required' from source: role '' defaults 25039 1726867468.58174: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25039 1726867468.58556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867468.63324: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867468.63545: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867468.63589: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867468.63627: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867468.64083: Loading 
FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867468.64088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867468.64091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867468.64093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867468.64095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867468.64120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867468.64171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867468.64411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867468.64444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867468.64491: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867468.64514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867468.64558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867468.64588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867468.64882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867468.64886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867468.64888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867468.65020: variable 'network_connections' from source: task vars 25039 1726867468.65196: variable 'interface' from source: play vars 25039 1726867468.65262: variable 'interface' from source: play vars 25039 1726867468.65340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867468.65740: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 
1726867468.65922: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867468.65957: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867468.65993: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867468.66042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25039 1726867468.66110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25039 1726867468.66216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867468.66249: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25039 1726867468.66403: variable '__network_wireless_connections_defined' from source: role '' defaults 25039 1726867468.66857: variable 'network_connections' from source: task vars 25039 1726867468.66868: variable 'interface' from source: play vars 25039 1726867468.66947: variable 'interface' from source: play vars 25039 1726867468.67116: Evaluated conditional (__network_wpa_supplicant_required): False 25039 1726867468.67124: when evaluation is False, skipping this task 25039 1726867468.67132: _execute() done 25039 1726867468.67139: dumping result to json 25039 1726867468.67148: done dumping result, returning 25039 1726867468.67162: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 
[0affcac9-a3a5-3ddc-7272-00000000007b] 25039 1726867468.67185: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000007b skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 25039 1726867468.67342: no more pending results, returning what we have 25039 1726867468.67346: results queue empty 25039 1726867468.67347: checking for any_errors_fatal 25039 1726867468.67364: done checking for any_errors_fatal 25039 1726867468.67365: checking for max_fail_percentage 25039 1726867468.67366: done checking for max_fail_percentage 25039 1726867468.67367: checking to see if all hosts have failed and the running result is not ok 25039 1726867468.67369: done checking to see if all hosts have failed 25039 1726867468.67369: getting the remaining hosts for this loop 25039 1726867468.67371: done getting the remaining hosts for this loop 25039 1726867468.67374: getting the next task for host managed_node1 25039 1726867468.67385: done getting next task for host managed_node1 25039 1726867468.67389: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 25039 1726867468.67392: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867468.67410: getting variables 25039 1726867468.67412: in VariableManager get_vars() 25039 1726867468.67451: Calling all_inventory to load vars for managed_node1 25039 1726867468.67453: Calling groups_inventory to load vars for managed_node1 25039 1726867468.67455: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867468.67466: Calling all_plugins_play to load vars for managed_node1 25039 1726867468.67469: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867468.67471: Calling groups_plugins_play to load vars for managed_node1 25039 1726867468.68285: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000007b 25039 1726867468.68289: WORKER PROCESS EXITING 25039 1726867468.70628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867468.73832: done with get_vars() 25039 1726867468.73855: done getting variables 25039 1726867468.73924: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:24:28 -0400 (0:00:00.192) 0:00:26.265 ****** 25039 1726867468.73960: entering _queue_task() for managed_node1/service 25039 1726867468.74924: worker is 1 (out of 1 available) 25039 1726867468.74937: exiting _queue_task() for managed_node1/service 25039 1726867468.74950: done queuing things up, now waiting for results queue to drain 25039 1726867468.74951: waiting for pending results... 
25039 1726867468.75271: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 25039 1726867468.75473: in run() - task 0affcac9-a3a5-3ddc-7272-00000000007c 25039 1726867468.75735: variable 'ansible_search_path' from source: unknown 25039 1726867468.75739: variable 'ansible_search_path' from source: unknown 25039 1726867468.75741: calling self._execute() 25039 1726867468.75786: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867468.75853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867468.75868: variable 'omit' from source: magic vars 25039 1726867468.76567: variable 'ansible_distribution_major_version' from source: facts 25039 1726867468.76614: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867468.76798: variable 'network_provider' from source: set_fact 25039 1726867468.76982: Evaluated conditional (network_provider == "initscripts"): False 25039 1726867468.76986: when evaluation is False, skipping this task 25039 1726867468.76989: _execute() done 25039 1726867468.76991: dumping result to json 25039 1726867468.76993: done dumping result, returning 25039 1726867468.76995: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-3ddc-7272-00000000007c] 25039 1726867468.76997: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000007c skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25039 1726867468.77180: no more pending results, returning what we have 25039 1726867468.77184: results queue empty 25039 1726867468.77185: checking for any_errors_fatal 25039 1726867468.77190: done checking for any_errors_fatal 25039 1726867468.77191: checking for max_fail_percentage 25039 1726867468.77193: done checking for max_fail_percentage 25039 
1726867468.77193: checking to see if all hosts have failed and the running result is not ok 25039 1726867468.77194: done checking to see if all hosts have failed 25039 1726867468.77195: getting the remaining hosts for this loop 25039 1726867468.77196: done getting the remaining hosts for this loop 25039 1726867468.77199: getting the next task for host managed_node1 25039 1726867468.77207: done getting next task for host managed_node1 25039 1726867468.77212: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25039 1726867468.77216: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867468.77236: getting variables 25039 1726867468.77238: in VariableManager get_vars() 25039 1726867468.77381: Calling all_inventory to load vars for managed_node1 25039 1726867468.77384: Calling groups_inventory to load vars for managed_node1 25039 1726867468.77387: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867468.77393: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000007c 25039 1726867468.77395: WORKER PROCESS EXITING 25039 1726867468.77411: Calling all_plugins_play to load vars for managed_node1 25039 1726867468.77414: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867468.77417: Calling groups_plugins_play to load vars for managed_node1 25039 1726867468.79856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867468.83387: done with get_vars() 25039 1726867468.83410: done getting variables 25039 1726867468.83464: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:24:28 -0400 (0:00:00.095) 0:00:26.360 ****** 25039 1726867468.83501: entering _queue_task() for managed_node1/copy 25039 1726867468.84220: worker is 1 (out of 1 available) 25039 1726867468.84233: exiting _queue_task() for managed_node1/copy 25039 1726867468.84244: done queuing things up, now waiting for results queue to drain 25039 1726867468.84246: waiting for pending results... 
25039 1726867468.84572: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25039 1726867468.84866: in run() - task 0affcac9-a3a5-3ddc-7272-00000000007d 25039 1726867468.84889: variable 'ansible_search_path' from source: unknown 25039 1726867468.84933: variable 'ansible_search_path' from source: unknown 25039 1726867468.84975: calling self._execute() 25039 1726867468.85200: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867468.85248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867468.85269: variable 'omit' from source: magic vars 25039 1726867468.86091: variable 'ansible_distribution_major_version' from source: facts 25039 1726867468.86134: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867468.86384: variable 'network_provider' from source: set_fact 25039 1726867468.86532: Evaluated conditional (network_provider == "initscripts"): False 25039 1726867468.86536: when evaluation is False, skipping this task 25039 1726867468.86538: _execute() done 25039 1726867468.86541: dumping result to json 25039 1726867468.86543: done dumping result, returning 25039 1726867468.86546: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-3ddc-7272-00000000007d] 25039 1726867468.86548: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000007d 25039 1726867468.86625: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000007d 25039 1726867468.86629: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 25039 1726867468.86682: no more pending results, returning what we have 25039 1726867468.86686: results queue empty 25039 1726867468.86688: checking for 
any_errors_fatal 25039 1726867468.86692: done checking for any_errors_fatal 25039 1726867468.86693: checking for max_fail_percentage 25039 1726867468.86695: done checking for max_fail_percentage 25039 1726867468.86696: checking to see if all hosts have failed and the running result is not ok 25039 1726867468.86696: done checking to see if all hosts have failed 25039 1726867468.86697: getting the remaining hosts for this loop 25039 1726867468.86698: done getting the remaining hosts for this loop 25039 1726867468.86702: getting the next task for host managed_node1 25039 1726867468.86711: done getting next task for host managed_node1 25039 1726867468.86714: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25039 1726867468.86717: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867468.86736: getting variables 25039 1726867468.86738: in VariableManager get_vars() 25039 1726867468.86774: Calling all_inventory to load vars for managed_node1 25039 1726867468.86779: Calling groups_inventory to load vars for managed_node1 25039 1726867468.86782: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867468.86794: Calling all_plugins_play to load vars for managed_node1 25039 1726867468.86798: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867468.86802: Calling groups_plugins_play to load vars for managed_node1 25039 1726867468.88776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867468.90612: done with get_vars() 25039 1726867468.90632: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:24:28 -0400 (0:00:00.072) 0:00:26.432 ****** 25039 1726867468.90716: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 25039 1726867468.91352: worker is 1 (out of 1 available) 25039 1726867468.91364: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 25039 1726867468.91584: done queuing things up, now waiting for results queue to drain 25039 1726867468.91586: waiting for pending results... 
25039 1726867468.91815: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25039 1726867468.91913: in run() - task 0affcac9-a3a5-3ddc-7272-00000000007e 25039 1726867468.91918: variable 'ansible_search_path' from source: unknown 25039 1726867468.92022: variable 'ansible_search_path' from source: unknown 25039 1726867468.92026: calling self._execute() 25039 1726867468.92081: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867468.92096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867468.92110: variable 'omit' from source: magic vars 25039 1726867468.92505: variable 'ansible_distribution_major_version' from source: facts 25039 1726867468.92523: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867468.92534: variable 'omit' from source: magic vars 25039 1726867468.92600: variable 'omit' from source: magic vars 25039 1726867468.92765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867468.95086: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867468.95150: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867468.95198: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867468.95237: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867468.95268: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867468.95355: variable 'network_provider' from source: set_fact 25039 1726867468.95512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867468.95536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867468.95566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867468.95620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867468.95729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867468.95732: variable 'omit' from source: magic vars 25039 1726867468.95839: variable 'omit' from source: magic vars 25039 1726867468.95946: variable 'network_connections' from source: task vars 25039 1726867468.95962: variable 'interface' from source: play vars 25039 1726867468.96027: variable 'interface' from source: play vars 25039 1726867468.96185: variable 'omit' from source: magic vars 25039 1726867468.96199: variable '__lsr_ansible_managed' from source: task vars 25039 1726867468.96275: variable '__lsr_ansible_managed' from source: task vars 25039 1726867468.96846: Loaded config def from plugin (lookup/template) 25039 1726867468.96858: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 25039 1726867468.96891: File lookup term: get_ansible_managed.j2 25039 1726867468.96898: variable 'ansible_search_path' from source: unknown 25039 1726867468.96910: evaluation_path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 25039 1726867468.96934: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 25039 1726867468.96956: variable 'ansible_search_path' from source: unknown 25039 1726867469.04186: variable 'ansible_managed' from source: unknown 25039 1726867469.04399: variable 'omit' from source: magic vars 25039 1726867469.04402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867469.04421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867469.04447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867469.04470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867469.04487: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867469.04518: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867469.04535: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.04543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867469.04709: Set connection var ansible_shell_executable to /bin/sh 25039 1726867469.04750: Set connection var ansible_timeout to 10 25039 1726867469.04753: Set connection var ansible_shell_type to sh 25039 1726867469.04755: Set connection var ansible_connection to ssh 25039 1726867469.04757: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867469.04759: Set connection var ansible_pipelining to False 25039 1726867469.04785: variable 'ansible_shell_executable' from source: unknown 25039 1726867469.04792: variable 'ansible_connection' from source: unknown 25039 1726867469.04799: variable 'ansible_module_compression' from source: unknown 25039 1726867469.04860: variable 'ansible_shell_type' from source: unknown 25039 1726867469.04863: variable 'ansible_shell_executable' from source: unknown 25039 1726867469.04865: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.04868: variable 'ansible_pipelining' from source: unknown 25039 1726867469.04870: variable 'ansible_timeout' from source: unknown 25039 1726867469.04872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867469.04981: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25039 1726867469.05004: variable 'omit' from source: magic vars 25039 1726867469.05014: starting attempt loop 25039 
1726867469.05022: running the handler 25039 1726867469.05037: _low_level_execute_command(): starting 25039 1726867469.05080: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867469.05736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867469.05756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867469.05792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867469.05872: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867469.05897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867469.05913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867469.05938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867469.06018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867469.07698: stdout chunk (state=3): >>>/root <<< 25039 1726867469.07833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867469.07858: stdout chunk (state=3): >>><<< 25039 1726867469.07861: 
stderr chunk (state=3): >>><<< 25039 1726867469.07967: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867469.07972: _low_level_execute_command(): starting 25039 1726867469.07974: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583 `" && echo ansible-tmp-1726867469.078875-26274-159988228682583="` echo /root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583 `" ) && sleep 0' 25039 1726867469.08485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867469.08497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867469.08511: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 25039 1726867469.08594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867469.08635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867469.08648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867469.08668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867469.08747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867469.10629: stdout chunk (state=3): >>>ansible-tmp-1726867469.078875-26274-159988228682583=/root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583 <<< 25039 1726867469.10765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867469.10774: stdout chunk (state=3): >>><<< 25039 1726867469.10792: stderr chunk (state=3): >>><<< 25039 1726867469.10987: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867469.078875-26274-159988228682583=/root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867469.10991: variable 'ansible_module_compression' from source: unknown 25039 1726867469.10993: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 25039 1726867469.10996: variable 'ansible_facts' from source: unknown 25039 1726867469.11090: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583/AnsiballZ_network_connections.py 25039 1726867469.11222: Sending initial data 25039 1726867469.11325: Sent initial data (167 bytes) 25039 1726867469.11858: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867469.11881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867469.11897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 
1726867469.11921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867469.11936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867469.11988: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867469.12047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867469.12067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867469.12102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867469.12168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867469.13705: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25039 1726867469.13735: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867469.13774: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867469.13828: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpyyc7_rzu /root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583/AnsiballZ_network_connections.py <<< 25039 1726867469.13831: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583/AnsiballZ_network_connections.py" <<< 25039 1726867469.13873: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpyyc7_rzu" to remote "/root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583/AnsiballZ_network_connections.py" <<< 25039 1726867469.14948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867469.14951: stdout chunk (state=3): >>><<< 25039 1726867469.14953: stderr chunk (state=3): >>><<< 25039 1726867469.14973: done transferring module to remote 25039 1726867469.14991: _low_level_execute_command(): starting 25039 1726867469.15000: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583/ /root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583/AnsiballZ_network_connections.py && sleep 0' 25039 1726867469.15646: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867469.15662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
25039 1726867469.15679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867469.15697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867469.15724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867469.15793: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867469.15841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867469.15855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867469.15879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867469.15955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867469.17754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867469.17758: stdout chunk (state=3): >>><<< 25039 1726867469.17761: stderr chunk (state=3): >>><<< 25039 1726867469.17782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867469.17792: _low_level_execute_command(): starting 25039 1726867469.17880: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583/AnsiballZ_network_connections.py && sleep 0' 25039 1726867469.18391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867469.18400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867469.18427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867469.18430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867469.18536: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867469.18544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867469.18547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867469.18570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867469.18647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867469.50185: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lvxn6bab/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lvxn6bab/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/05d68ca3-7a29-47b4-8db1-5de4d05c6555: error=unknown <<< 25039 1726867469.50243: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 25039 1726867469.52198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 25039 1726867469.52231: stderr chunk (state=3): >>><<< 25039 1726867469.52234: stdout chunk (state=3): >>><<< 25039 1726867469.52258: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lvxn6bab/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lvxn6bab/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/05d68ca3-7a29-47b4-8db1-5de4d05c6555: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867469.52310: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867469.52317: _low_level_execute_command(): starting 25039 1726867469.52323: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867469.078875-26274-159988228682583/ > /dev/null 2>&1 && sleep 0' 25039 1726867469.53637: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867469.53656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867469.53673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867469.53865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867469.53910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867469.53931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867469.54030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867469.54147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867469.56148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867469.56223: stderr chunk (state=3): >>><<< 25039 1726867469.56233: stdout chunk (state=3): >>><<< 25039 1726867469.56260: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867469.56463: handler run complete 25039 1726867469.56466: attempt loop complete, returning result 25039 1726867469.56469: _execute() done 25039 1726867469.56471: dumping result to json 25039 1726867469.56473: done dumping result, returning 25039 1726867469.56475: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-3ddc-7272-00000000007e] 25039 1726867469.56479: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000007e 25039 1726867469.56557: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000007e changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 25039 1726867469.56672: no more pending results, returning what we have 25039 1726867469.56780: results queue empty 25039 1726867469.56782: checking for any_errors_fatal 25039 1726867469.56788: done checking for any_errors_fatal 25039 1726867469.56789: checking for max_fail_percentage 25039 1726867469.56791: done checking for max_fail_percentage 25039 1726867469.56792: checking to see if all hosts have failed and the running result is not ok 25039 1726867469.56793: done checking to see if all hosts have failed 25039 1726867469.56794: getting the remaining hosts for this loop 25039 1726867469.56795: done getting the remaining hosts for this loop 25039 1726867469.56798: getting the next task for host managed_node1 25039 1726867469.56806: done getting next task for host 
managed_node1 25039 1726867469.56812: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 25039 1726867469.56815: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25039 1726867469.56827: getting variables 25039 1726867469.56829: in VariableManager get_vars() 25039 1726867469.56869: Calling all_inventory to load vars for managed_node1 25039 1726867469.56872: Calling groups_inventory to load vars for managed_node1 25039 1726867469.56875: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867469.56990: Calling all_plugins_play to load vars for managed_node1 25039 1726867469.56994: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867469.56998: Calling groups_plugins_play to load vars for managed_node1 25039 1726867469.57591: WORKER PROCESS EXITING 25039 1726867469.58926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867469.60522: done with get_vars() 25039 1726867469.60543: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:24:29 -0400 (0:00:00.699) 0:00:27.131 ****** 25039 1726867469.60635: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 25039 1726867469.60963: worker is 1 (out of 1 
available) 25039 1726867469.60976: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 25039 1726867469.61189: done queuing things up, now waiting for results queue to drain 25039 1726867469.61191: waiting for pending results... 25039 1726867469.61266: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 25039 1726867469.61399: in run() - task 0affcac9-a3a5-3ddc-7272-00000000007f 25039 1726867469.61428: variable 'ansible_search_path' from source: unknown 25039 1726867469.61436: variable 'ansible_search_path' from source: unknown 25039 1726867469.61487: calling self._execute() 25039 1726867469.61603: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.61618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867469.61855: variable 'omit' from source: magic vars 25039 1726867469.62726: variable 'ansible_distribution_major_version' from source: facts 25039 1726867469.62729: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867469.62840: variable 'network_state' from source: role '' defaults 25039 1726867469.62857: Evaluated conditional (network_state != {}): False 25039 1726867469.62866: when evaluation is False, skipping this task 25039 1726867469.62874: _execute() done 25039 1726867469.63051: dumping result to json 25039 1726867469.63054: done dumping result, returning 25039 1726867469.63057: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-3ddc-7272-00000000007f] 25039 1726867469.63060: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000007f 25039 1726867469.63132: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000007f 25039 1726867469.63136: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": 
"Conditional result was False" } 25039 1726867469.63214: no more pending results, returning what we have 25039 1726867469.63219: results queue empty 25039 1726867469.63220: checking for any_errors_fatal 25039 1726867469.63232: done checking for any_errors_fatal 25039 1726867469.63233: checking for max_fail_percentage 25039 1726867469.63235: done checking for max_fail_percentage 25039 1726867469.63236: checking to see if all hosts have failed and the running result is not ok 25039 1726867469.63237: done checking to see if all hosts have failed 25039 1726867469.63238: getting the remaining hosts for this loop 25039 1726867469.63239: done getting the remaining hosts for this loop 25039 1726867469.63243: getting the next task for host managed_node1 25039 1726867469.63252: done getting next task for host managed_node1 25039 1726867469.63257: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25039 1726867469.63260: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867469.63284: getting variables 25039 1726867469.63286: in VariableManager get_vars() 25039 1726867469.63331: Calling all_inventory to load vars for managed_node1 25039 1726867469.63334: Calling groups_inventory to load vars for managed_node1 25039 1726867469.63337: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867469.63350: Calling all_plugins_play to load vars for managed_node1 25039 1726867469.63354: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867469.63358: Calling groups_plugins_play to load vars for managed_node1 25039 1726867469.65418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867469.67460: done with get_vars() 25039 1726867469.67484: done getting variables 25039 1726867469.67547: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:24:29 -0400 (0:00:00.069) 0:00:27.201 ****** 25039 1726867469.67585: entering _queue_task() for managed_node1/debug 25039 1726867469.68121: worker is 1 (out of 1 available) 25039 1726867469.68134: exiting _queue_task() for managed_node1/debug 25039 1726867469.68148: done queuing things up, now waiting for results queue to drain 25039 1726867469.68150: waiting for pending results... 
25039 1726867469.68418: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25039 1726867469.68565: in run() - task 0affcac9-a3a5-3ddc-7272-000000000080 25039 1726867469.68617: variable 'ansible_search_path' from source: unknown 25039 1726867469.68621: variable 'ansible_search_path' from source: unknown 25039 1726867469.68644: calling self._execute() 25039 1726867469.68750: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.68835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867469.68838: variable 'omit' from source: magic vars 25039 1726867469.69433: variable 'ansible_distribution_major_version' from source: facts 25039 1726867469.69452: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867469.69465: variable 'omit' from source: magic vars 25039 1726867469.69685: variable 'omit' from source: magic vars 25039 1726867469.69701: variable 'omit' from source: magic vars 25039 1726867469.69755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867469.69858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867469.69957: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867469.69982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867469.69999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867469.70075: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867469.70261: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.70264: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 25039 1726867469.70267: Set connection var ansible_shell_executable to /bin/sh 25039 1726867469.70414: Set connection var ansible_timeout to 10 25039 1726867469.70424: Set connection var ansible_shell_type to sh 25039 1726867469.70430: Set connection var ansible_connection to ssh 25039 1726867469.70441: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867469.70457: Set connection var ansible_pipelining to False 25039 1726867469.70534: variable 'ansible_shell_executable' from source: unknown 25039 1726867469.70542: variable 'ansible_connection' from source: unknown 25039 1726867469.70548: variable 'ansible_module_compression' from source: unknown 25039 1726867469.70554: variable 'ansible_shell_type' from source: unknown 25039 1726867469.70560: variable 'ansible_shell_executable' from source: unknown 25039 1726867469.70700: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.70703: variable 'ansible_pipelining' from source: unknown 25039 1726867469.70705: variable 'ansible_timeout' from source: unknown 25039 1726867469.70707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867469.70946: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867469.70963: variable 'omit' from source: magic vars 25039 1726867469.70991: starting attempt loop 25039 1726867469.71000: running the handler 25039 1726867469.71328: variable '__network_connections_result' from source: set_fact 25039 1726867469.71404: handler run complete 25039 1726867469.71456: attempt loop complete, returning result 25039 1726867469.71493: _execute() done 25039 1726867469.71594: dumping result to json 25039 1726867469.71597: 
done dumping result, returning 25039 1726867469.71600: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-3ddc-7272-000000000080] 25039 1726867469.71602: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000080 ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 25039 1726867469.71894: no more pending results, returning what we have 25039 1726867469.71898: results queue empty 25039 1726867469.71899: checking for any_errors_fatal 25039 1726867469.71905: done checking for any_errors_fatal 25039 1726867469.71906: checking for max_fail_percentage 25039 1726867469.71908: done checking for max_fail_percentage 25039 1726867469.71909: checking to see if all hosts have failed and the running result is not ok 25039 1726867469.71910: done checking to see if all hosts have failed 25039 1726867469.71910: getting the remaining hosts for this loop 25039 1726867469.71912: done getting the remaining hosts for this loop 25039 1726867469.71915: getting the next task for host managed_node1 25039 1726867469.71922: done getting next task for host managed_node1 25039 1726867469.71925: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25039 1726867469.71928: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867469.71943: getting variables 25039 1726867469.71945: in VariableManager get_vars() 25039 1726867469.71986: Calling all_inventory to load vars for managed_node1 25039 1726867469.71989: Calling groups_inventory to load vars for managed_node1 25039 1726867469.71992: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867469.72004: Calling all_plugins_play to load vars for managed_node1 25039 1726867469.72007: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867469.72010: Calling groups_plugins_play to load vars for managed_node1 25039 1726867469.72621: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000080 25039 1726867469.72624: WORKER PROCESS EXITING 25039 1726867469.76335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867469.79961: done with get_vars() 25039 1726867469.79991: done getting variables 25039 1726867469.80041: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:24:29 -0400 (0:00:00.124) 0:00:27.326 ****** 25039 1726867469.80071: entering _queue_task() for managed_node1/debug 25039 1726867469.80342: worker is 1 (out of 1 available) 25039 1726867469.80355: exiting _queue_task() for managed_node1/debug 25039 1726867469.80368: done queuing things up, now waiting for results queue to drain 25039 1726867469.80370: waiting for pending results... 
25039 1726867469.80551: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25039 1726867469.80643: in run() - task 0affcac9-a3a5-3ddc-7272-000000000081 25039 1726867469.80655: variable 'ansible_search_path' from source: unknown 25039 1726867469.80659: variable 'ansible_search_path' from source: unknown 25039 1726867469.80692: calling self._execute() 25039 1726867469.80771: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.80776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867469.80821: variable 'omit' from source: magic vars 25039 1726867469.81203: variable 'ansible_distribution_major_version' from source: facts 25039 1726867469.81214: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867469.81268: variable 'omit' from source: magic vars 25039 1726867469.81271: variable 'omit' from source: magic vars 25039 1726867469.81312: variable 'omit' from source: magic vars 25039 1726867469.81350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867469.81385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867469.81404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867469.81422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867469.81492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867469.81495: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867469.81498: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.81500: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 25039 1726867469.81563: Set connection var ansible_shell_executable to /bin/sh 25039 1726867469.81568: Set connection var ansible_timeout to 10 25039 1726867469.81575: Set connection var ansible_shell_type to sh 25039 1726867469.81580: Set connection var ansible_connection to ssh 25039 1726867469.81599: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867469.81602: Set connection var ansible_pipelining to False 25039 1726867469.81685: variable 'ansible_shell_executable' from source: unknown 25039 1726867469.81690: variable 'ansible_connection' from source: unknown 25039 1726867469.81697: variable 'ansible_module_compression' from source: unknown 25039 1726867469.81700: variable 'ansible_shell_type' from source: unknown 25039 1726867469.81704: variable 'ansible_shell_executable' from source: unknown 25039 1726867469.81706: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.81710: variable 'ansible_pipelining' from source: unknown 25039 1726867469.81712: variable 'ansible_timeout' from source: unknown 25039 1726867469.81714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867469.81939: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867469.81943: variable 'omit' from source: magic vars 25039 1726867469.81945: starting attempt loop 25039 1726867469.81948: running the handler 25039 1726867469.82075: variable '__network_connections_result' from source: set_fact 25039 1726867469.82124: variable '__network_connections_result' from source: set_fact 25039 1726867469.82271: handler run complete 25039 1726867469.82312: attempt loop complete, returning result 25039 1726867469.82321: 
_execute() done 25039 1726867469.82327: dumping result to json 25039 1726867469.82335: done dumping result, returning 25039 1726867469.82383: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-3ddc-7272-000000000081] 25039 1726867469.82388: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000081 ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 25039 1726867469.82538: no more pending results, returning what we have 25039 1726867469.82542: results queue empty 25039 1726867469.82544: checking for any_errors_fatal 25039 1726867469.82551: done checking for any_errors_fatal 25039 1726867469.82552: checking for max_fail_percentage 25039 1726867469.82554: done checking for max_fail_percentage 25039 1726867469.82555: checking to see if all hosts have failed and the running result is not ok 25039 1726867469.82556: done checking to see if all hosts have failed 25039 1726867469.82557: getting the remaining hosts for this loop 25039 1726867469.82559: done getting the remaining hosts for this loop 25039 1726867469.82562: getting the next task for host managed_node1 25039 1726867469.82571: done getting next task for host managed_node1 25039 1726867469.82575: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25039 1726867469.82580: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25039 1726867469.82593: getting variables 25039 1726867469.82595: in VariableManager get_vars() 25039 1726867469.82636: Calling all_inventory to load vars for managed_node1 25039 1726867469.82639: Calling groups_inventory to load vars for managed_node1 25039 1726867469.82642: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867469.82653: Calling all_plugins_play to load vars for managed_node1 25039 1726867469.82657: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867469.82661: Calling groups_plugins_play to load vars for managed_node1 25039 1726867469.83581: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000081 25039 1726867469.83584: WORKER PROCESS EXITING 25039 1726867469.83598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867469.84458: done with get_vars() 25039 1726867469.84473: done getting variables 25039 1726867469.84517: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:24:29 -0400 (0:00:00.044) 0:00:27.371 ****** 25039 1726867469.84543: entering _queue_task() for managed_node1/debug 25039 
1726867469.84760: worker is 1 (out of 1 available) 25039 1726867469.84774: exiting _queue_task() for managed_node1/debug 25039 1726867469.84790: done queuing things up, now waiting for results queue to drain 25039 1726867469.84792: waiting for pending results... 25039 1726867469.84971: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25039 1726867469.85066: in run() - task 0affcac9-a3a5-3ddc-7272-000000000082 25039 1726867469.85079: variable 'ansible_search_path' from source: unknown 25039 1726867469.85084: variable 'ansible_search_path' from source: unknown 25039 1726867469.85113: calling self._execute() 25039 1726867469.85186: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.85189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867469.85199: variable 'omit' from source: magic vars 25039 1726867469.85471: variable 'ansible_distribution_major_version' from source: facts 25039 1726867469.85483: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867469.85562: variable 'network_state' from source: role '' defaults 25039 1726867469.85570: Evaluated conditional (network_state != {}): False 25039 1726867469.85574: when evaluation is False, skipping this task 25039 1726867469.85576: _execute() done 25039 1726867469.85581: dumping result to json 25039 1726867469.85584: done dumping result, returning 25039 1726867469.85595: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-3ddc-7272-000000000082] 25039 1726867469.85598: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000082 25039 1726867469.85680: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000082 25039 1726867469.85683: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 25039 
1726867469.85731: no more pending results, returning what we have 25039 1726867469.85735: results queue empty 25039 1726867469.85736: checking for any_errors_fatal 25039 1726867469.85743: done checking for any_errors_fatal 25039 1726867469.85744: checking for max_fail_percentage 25039 1726867469.85745: done checking for max_fail_percentage 25039 1726867469.85746: checking to see if all hosts have failed and the running result is not ok 25039 1726867469.85747: done checking to see if all hosts have failed 25039 1726867469.85748: getting the remaining hosts for this loop 25039 1726867469.85749: done getting the remaining hosts for this loop 25039 1726867469.85752: getting the next task for host managed_node1 25039 1726867469.85758: done getting next task for host managed_node1 25039 1726867469.85761: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 25039 1726867469.85764: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867469.85780: getting variables 25039 1726867469.85781: in VariableManager get_vars() 25039 1726867469.85814: Calling all_inventory to load vars for managed_node1 25039 1726867469.85817: Calling groups_inventory to load vars for managed_node1 25039 1726867469.85819: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867469.85827: Calling all_plugins_play to load vars for managed_node1 25039 1726867469.85829: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867469.85832: Calling groups_plugins_play to load vars for managed_node1 25039 1726867469.90670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867469.91752: done with get_vars() 25039 1726867469.91768: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:24:29 -0400 (0:00:00.072) 0:00:27.443 ****** 25039 1726867469.91829: entering _queue_task() for managed_node1/ping 25039 1726867469.92099: worker is 1 (out of 1 available) 25039 1726867469.92114: exiting _queue_task() for managed_node1/ping 25039 1726867469.92127: done queuing things up, now waiting for results queue to drain 25039 1726867469.92129: waiting for pending results... 
25039 1726867469.92314: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 25039 1726867469.92406: in run() - task 0affcac9-a3a5-3ddc-7272-000000000083 25039 1726867469.92422: variable 'ansible_search_path' from source: unknown 25039 1726867469.92427: variable 'ansible_search_path' from source: unknown 25039 1726867469.92455: calling self._execute() 25039 1726867469.92534: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.92539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867469.92547: variable 'omit' from source: magic vars 25039 1726867469.92832: variable 'ansible_distribution_major_version' from source: facts 25039 1726867469.92841: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867469.92847: variable 'omit' from source: magic vars 25039 1726867469.92939: variable 'omit' from source: magic vars 25039 1726867469.93183: variable 'omit' from source: magic vars 25039 1726867469.93186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867469.93191: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867469.93194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867469.93196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867469.93199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867469.93201: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867469.93203: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.93205: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 25039 1726867469.93285: Set connection var ansible_shell_executable to /bin/sh 25039 1726867469.93297: Set connection var ansible_timeout to 10 25039 1726867469.93306: Set connection var ansible_shell_type to sh 25039 1726867469.93323: Set connection var ansible_connection to ssh 25039 1726867469.93340: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867469.93349: Set connection var ansible_pipelining to False 25039 1726867469.93379: variable 'ansible_shell_executable' from source: unknown 25039 1726867469.93388: variable 'ansible_connection' from source: unknown 25039 1726867469.93395: variable 'ansible_module_compression' from source: unknown 25039 1726867469.93400: variable 'ansible_shell_type' from source: unknown 25039 1726867469.93406: variable 'ansible_shell_executable' from source: unknown 25039 1726867469.93416: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867469.93422: variable 'ansible_pipelining' from source: unknown 25039 1726867469.93438: variable 'ansible_timeout' from source: unknown 25039 1726867469.93446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867469.93673: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25039 1726867469.93695: variable 'omit' from source: magic vars 25039 1726867469.93723: starting attempt loop 25039 1726867469.93727: running the handler 25039 1726867469.93731: _low_level_execute_command(): starting 25039 1726867469.93748: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867469.94259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 
1726867469.94263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867469.94267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867469.94324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867469.94327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867469.94376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867469.96043: stdout chunk (state=3): >>>/root <<< 25039 1726867469.96141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867469.96168: stderr chunk (state=3): >>><<< 25039 1726867469.96172: stdout chunk (state=3): >>><<< 25039 1726867469.96198: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867469.96210: _low_level_execute_command(): starting 25039 1726867469.96214: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004 `" && echo ansible-tmp-1726867469.961949-26319-95249720223004="` echo /root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004 `" ) && sleep 0' 25039 1726867469.96648: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867469.96652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867469.96654: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25039 1726867469.96657: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867469.96667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867469.96710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867469.96716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867469.96765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867469.98644: stdout chunk (state=3): >>>ansible-tmp-1726867469.961949-26319-95249720223004=/root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004 <<< 25039 1726867469.98755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867469.98781: stderr chunk (state=3): >>><<< 25039 1726867469.98784: stdout chunk (state=3): >>><<< 25039 1726867469.98799: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867469.961949-26319-95249720223004=/root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867469.98836: variable 'ansible_module_compression' from source: unknown 25039 1726867469.98872: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 25039 1726867469.98903: variable 'ansible_facts' from source: unknown 25039 1726867469.98963: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004/AnsiballZ_ping.py 25039 1726867469.99055: Sending initial data 25039 1726867469.99059: Sent initial data (151 bytes) 25039 1726867469.99473: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867469.99511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867469.99515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867469.99517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867469.99519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 25039 1726867469.99522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867469.99568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867469.99575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867469.99619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867470.01135: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25039 1726867470.01139: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867470.01176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867470.01222: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmph8bdgsq3 /root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004/AnsiballZ_ping.py <<< 25039 1726867470.01226: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004/AnsiballZ_ping.py" <<< 25039 1726867470.01270: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmph8bdgsq3" to remote "/root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004/AnsiballZ_ping.py" <<< 25039 1726867470.01802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867470.01836: stderr chunk (state=3): >>><<< 25039 1726867470.01840: stdout chunk (state=3): >>><<< 25039 1726867470.01858: done transferring module to remote 25039 1726867470.01866: _low_level_execute_command(): starting 25039 1726867470.01868: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004/ /root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004/AnsiballZ_ping.py && sleep 0' 25039 1726867470.02346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867470.02441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867470.04220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867470.04229: stdout chunk (state=3): >>><<< 25039 1726867470.04232: stderr chunk (state=3): >>><<< 25039 1726867470.04327: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867470.04331: _low_level_execute_command(): starting 25039 1726867470.04334: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004/AnsiballZ_ping.py && sleep 0' 25039 1726867470.04854: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867470.04869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867470.04887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867470.04914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867470.04935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867470.04948: stderr chunk (state=3): >>>debug2: match not found <<< 25039 1726867470.04962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867470.04992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867470.05036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867470.05096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867470.05117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867470.05164: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867470.05217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867470.20319: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 25039 1726867470.21450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867470.21465: stderr chunk (state=3): >>>Shared connection to 10.31.12.57 closed. <<< 25039 1726867470.21537: stderr chunk (state=3): >>><<< 25039 1726867470.21718: stdout chunk (state=3): >>><<< 25039 1726867470.21722: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867470.21725: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867470.21728: _low_level_execute_command(): starting 25039 1726867470.21730: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867469.961949-26319-95249720223004/ > /dev/null 2>&1 && sleep 0' 25039 1726867470.22997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867470.23101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867470.23294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867470.23333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867470.23436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867470.25259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867470.25440: stderr chunk (state=3): >>><<< 25039 1726867470.25448: stdout chunk (state=3): >>><<< 25039 1726867470.25467: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867470.25487: handler run complete 25039 1726867470.25516: attempt loop complete, returning result 25039 1726867470.25524: _execute() done 25039 
1726867470.25530: dumping result to json 25039 1726867470.25537: done dumping result, returning 25039 1726867470.25550: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-3ddc-7272-000000000083] 25039 1726867470.25683: sending task result for task 0affcac9-a3a5-3ddc-7272-000000000083 ok: [managed_node1] => { "changed": false, "ping": "pong" } 25039 1726867470.25886: no more pending results, returning what we have 25039 1726867470.25890: results queue empty 25039 1726867470.25891: checking for any_errors_fatal 25039 1726867470.25899: done checking for any_errors_fatal 25039 1726867470.25899: checking for max_fail_percentage 25039 1726867470.25901: done checking for max_fail_percentage 25039 1726867470.25902: checking to see if all hosts have failed and the running result is not ok 25039 1726867470.25903: done checking to see if all hosts have failed 25039 1726867470.25904: getting the remaining hosts for this loop 25039 1726867470.25905: done getting the remaining hosts for this loop 25039 1726867470.25910: getting the next task for host managed_node1 25039 1726867470.25921: done getting next task for host managed_node1 25039 1726867470.25923: ^ task is: TASK: meta (role_complete) 25039 1726867470.25927: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867470.25940: getting variables 25039 1726867470.25942: in VariableManager get_vars() 25039 1726867470.26191: Calling all_inventory to load vars for managed_node1 25039 1726867470.26195: Calling groups_inventory to load vars for managed_node1 25039 1726867470.26198: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867470.26212: Calling all_plugins_play to load vars for managed_node1 25039 1726867470.26215: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867470.26218: Calling groups_plugins_play to load vars for managed_node1 25039 1726867470.26903: done sending task result for task 0affcac9-a3a5-3ddc-7272-000000000083 25039 1726867470.26907: WORKER PROCESS EXITING 25039 1726867470.29254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867470.33033: done with get_vars() 25039 1726867470.33056: done getting variables 25039 1726867470.33195: done queuing things up, now waiting for results queue to drain 25039 1726867470.33198: results queue empty 25039 1726867470.33198: checking for any_errors_fatal 25039 1726867470.33201: done checking for any_errors_fatal 25039 1726867470.33202: checking for max_fail_percentage 25039 1726867470.33203: done checking for max_fail_percentage 25039 1726867470.33204: checking to see if all hosts have failed and the running result is not ok 25039 1726867470.33205: done checking to see if all hosts have failed 25039 1726867470.33206: getting the remaining hosts for this loop 25039 1726867470.33207: done getting the remaining hosts for this loop 25039 1726867470.33212: getting the next task for host managed_node1 25039 1726867470.33217: done getting next task for host managed_node1 25039 1726867470.33220: ^ task is: TASK: Include the task 'manage_test_interface.yml' 25039 1726867470.33221: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867470.33224: getting variables 25039 1726867470.33225: in VariableManager get_vars() 25039 1726867470.33355: Calling all_inventory to load vars for managed_node1 25039 1726867470.33358: Calling groups_inventory to load vars for managed_node1 25039 1726867470.33361: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867470.33367: Calling all_plugins_play to load vars for managed_node1 25039 1726867470.33369: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867470.33372: Calling groups_plugins_play to load vars for managed_node1 25039 1726867470.35785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867470.39007: done with get_vars() 25039 1726867470.39149: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:104 Friday 20 September 2024 17:24:30 -0400 (0:00:00.474) 0:00:27.918 ****** 25039 1726867470.39301: entering _queue_task() for managed_node1/include_tasks 25039 1726867470.40122: worker is 1 (out of 1 available) 25039 1726867470.40133: exiting _queue_task() for managed_node1/include_tasks 25039 1726867470.40144: done queuing things up, now waiting for results queue to drain 25039 1726867470.40145: waiting for pending results... 
25039 1726867470.40486: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 25039 1726867470.40675: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000b3 25039 1726867470.40841: variable 'ansible_search_path' from source: unknown 25039 1726867470.40888: calling self._execute() 25039 1726867470.41028: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867470.41035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867470.41159: variable 'omit' from source: magic vars 25039 1726867470.41843: variable 'ansible_distribution_major_version' from source: facts 25039 1726867470.41862: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867470.41865: _execute() done 25039 1726867470.41868: dumping result to json 25039 1726867470.41871: done dumping result, returning 25039 1726867470.41875: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [0affcac9-a3a5-3ddc-7272-0000000000b3] 25039 1726867470.41881: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000b3 25039 1726867470.42216: no more pending results, returning what we have 25039 1726867470.42222: in VariableManager get_vars() 25039 1726867470.42267: Calling all_inventory to load vars for managed_node1 25039 1726867470.42269: Calling groups_inventory to load vars for managed_node1 25039 1726867470.42271: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867470.42291: Calling all_plugins_play to load vars for managed_node1 25039 1726867470.42295: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867470.42384: Calling groups_plugins_play to load vars for managed_node1 25039 1726867470.42999: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000b3 25039 1726867470.43002: WORKER PROCESS EXITING 25039 1726867470.44486: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867470.47068: done with get_vars() 25039 1726867470.47093: variable 'ansible_search_path' from source: unknown 25039 1726867470.47116: we have included files to process 25039 1726867470.47117: generating all_blocks data 25039 1726867470.47119: done generating all_blocks data 25039 1726867470.47125: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25039 1726867470.47126: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25039 1726867470.47129: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25039 1726867470.47558: in VariableManager get_vars() 25039 1726867470.47583: done with get_vars() 25039 1726867470.48384: done processing included file 25039 1726867470.48387: iterating over new_blocks loaded from include file 25039 1726867470.48388: in VariableManager get_vars() 25039 1726867470.48416: done with get_vars() 25039 1726867470.48418: filtering new block on tags 25039 1726867470.48450: done filtering new block on tags 25039 1726867470.48454: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 25039 1726867470.48459: extending task lists for all hosts with included blocks 25039 1726867470.50996: done extending task lists 25039 1726867470.50997: done processing included files 25039 1726867470.50998: results queue empty 25039 1726867470.50999: checking for any_errors_fatal 25039 1726867470.51001: done checking for any_errors_fatal 25039 1726867470.51001: checking for max_fail_percentage 25039 1726867470.51006: done 
checking for max_fail_percentage 25039 1726867470.51010: checking to see if all hosts have failed and the running result is not ok 25039 1726867470.51011: done checking to see if all hosts have failed 25039 1726867470.51012: getting the remaining hosts for this loop 25039 1726867470.51013: done getting the remaining hosts for this loop 25039 1726867470.51016: getting the next task for host managed_node1 25039 1726867470.51020: done getting next task for host managed_node1 25039 1726867470.51022: ^ task is: TASK: Ensure state in ["present", "absent"] 25039 1726867470.51024: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867470.51027: getting variables 25039 1726867470.51028: in VariableManager get_vars() 25039 1726867470.51041: Calling all_inventory to load vars for managed_node1 25039 1726867470.51043: Calling groups_inventory to load vars for managed_node1 25039 1726867470.51045: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867470.51050: Calling all_plugins_play to load vars for managed_node1 25039 1726867470.51052: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867470.51055: Calling groups_plugins_play to load vars for managed_node1 25039 1726867470.52276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867470.53280: done with get_vars() 25039 1726867470.53294: done getting variables 25039 1726867470.53324: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 17:24:30 -0400 (0:00:00.140) 0:00:28.059 ****** 25039 1726867470.53343: entering _queue_task() for managed_node1/fail 25039 1726867470.53596: worker is 1 (out of 1 available) 25039 1726867470.53612: exiting _queue_task() for managed_node1/fail 25039 1726867470.53625: done queuing things up, now waiting for results queue to drain 25039 1726867470.53626: waiting for pending results... 
25039 1726867470.53793: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 25039 1726867470.53852: in run() - task 0affcac9-a3a5-3ddc-7272-0000000005cc 25039 1726867470.53864: variable 'ansible_search_path' from source: unknown 25039 1726867470.53867: variable 'ansible_search_path' from source: unknown 25039 1726867470.53896: calling self._execute() 25039 1726867470.53971: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867470.53975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867470.53986: variable 'omit' from source: magic vars 25039 1726867470.54283: variable 'ansible_distribution_major_version' from source: facts 25039 1726867470.54345: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867470.54503: variable 'state' from source: include params 25039 1726867470.54506: Evaluated conditional (state not in ["present", "absent"]): False 25039 1726867470.54511: when evaluation is False, skipping this task 25039 1726867470.54512: _execute() done 25039 1726867470.54515: dumping result to json 25039 1726867470.54516: done dumping result, returning 25039 1726867470.54518: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [0affcac9-a3a5-3ddc-7272-0000000005cc] 25039 1726867470.54519: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005cc 25039 1726867470.54574: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005cc 25039 1726867470.54579: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 25039 1726867470.54633: no more pending results, returning what we have 25039 1726867470.54638: results queue empty 25039 1726867470.54639: checking for any_errors_fatal 25039 1726867470.54641: done checking for any_errors_fatal 25039 1726867470.54642: 
checking for max_fail_percentage 25039 1726867470.54644: done checking for max_fail_percentage 25039 1726867470.54645: checking to see if all hosts have failed and the running result is not ok 25039 1726867470.54646: done checking to see if all hosts have failed 25039 1726867470.54647: getting the remaining hosts for this loop 25039 1726867470.54648: done getting the remaining hosts for this loop 25039 1726867470.54652: getting the next task for host managed_node1 25039 1726867470.54659: done getting next task for host managed_node1 25039 1726867470.54662: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 25039 1726867470.54666: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867470.54672: getting variables 25039 1726867470.54674: in VariableManager get_vars() 25039 1726867470.54738: Calling all_inventory to load vars for managed_node1 25039 1726867470.54741: Calling groups_inventory to load vars for managed_node1 25039 1726867470.54744: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867470.54757: Calling all_plugins_play to load vars for managed_node1 25039 1726867470.54760: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867470.54763: Calling groups_plugins_play to load vars for managed_node1 25039 1726867470.55892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867470.56775: done with get_vars() 25039 1726867470.56791: done getting variables 25039 1726867470.56830: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 17:24:30 -0400 (0:00:00.035) 0:00:28.094 ****** 25039 1726867470.56850: entering _queue_task() for managed_node1/fail 25039 1726867470.57066: worker is 1 (out of 1 available) 25039 1726867470.57081: exiting _queue_task() for managed_node1/fail 25039 1726867470.57093: done queuing things up, now waiting for results queue to drain 25039 1726867470.57095: waiting for pending results... 
25039 1726867470.57250: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 25039 1726867470.57314: in run() - task 0affcac9-a3a5-3ddc-7272-0000000005cd 25039 1726867470.57325: variable 'ansible_search_path' from source: unknown 25039 1726867470.57329: variable 'ansible_search_path' from source: unknown 25039 1726867470.57355: calling self._execute() 25039 1726867470.57427: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867470.57431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867470.57445: variable 'omit' from source: magic vars 25039 1726867470.57882: variable 'ansible_distribution_major_version' from source: facts 25039 1726867470.57886: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867470.58013: variable 'type' from source: play vars 25039 1726867470.58034: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 25039 1726867470.58043: when evaluation is False, skipping this task 25039 1726867470.58051: _execute() done 25039 1726867470.58082: dumping result to json 25039 1726867470.58085: done dumping result, returning 25039 1726867470.58088: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcac9-a3a5-3ddc-7272-0000000005cd] 25039 1726867470.58090: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005cd 25039 1726867470.58359: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005cd 25039 1726867470.58363: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 25039 1726867470.58406: no more pending results, returning what we have 25039 1726867470.58411: results queue empty 25039 1726867470.58412: checking for any_errors_fatal 25039 1726867470.58416: done checking for any_errors_fatal 25039 1726867470.58417: 
checking for max_fail_percentage 25039 1726867470.58418: done checking for max_fail_percentage 25039 1726867470.58419: checking to see if all hosts have failed and the running result is not ok 25039 1726867470.58420: done checking to see if all hosts have failed 25039 1726867470.58420: getting the remaining hosts for this loop 25039 1726867470.58422: done getting the remaining hosts for this loop 25039 1726867470.58424: getting the next task for host managed_node1 25039 1726867470.58430: done getting next task for host managed_node1 25039 1726867470.58432: ^ task is: TASK: Include the task 'show_interfaces.yml' 25039 1726867470.58434: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867470.58438: getting variables 25039 1726867470.58439: in VariableManager get_vars() 25039 1726867470.58475: Calling all_inventory to load vars for managed_node1 25039 1726867470.58479: Calling groups_inventory to load vars for managed_node1 25039 1726867470.58482: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867470.58492: Calling all_plugins_play to load vars for managed_node1 25039 1726867470.58495: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867470.58498: Calling groups_plugins_play to load vars for managed_node1 25039 1726867470.59410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867470.60365: done with get_vars() 25039 1726867470.60388: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 17:24:30 -0400 (0:00:00.036) 0:00:28.130 ****** 25039 1726867470.60482: entering _queue_task() for managed_node1/include_tasks 25039 1726867470.60736: worker is 1 (out of 1 available) 25039 1726867470.60747: exiting _queue_task() for managed_node1/include_tasks 25039 1726867470.60757: done queuing things up, now waiting for results queue to drain 25039 1726867470.60758: waiting for pending results... 
25039 1726867470.61026: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 25039 1726867470.61146: in run() - task 0affcac9-a3a5-3ddc-7272-0000000005ce 25039 1726867470.61157: variable 'ansible_search_path' from source: unknown 25039 1726867470.61165: variable 'ansible_search_path' from source: unknown 25039 1726867470.61201: calling self._execute() 25039 1726867470.61322: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867470.61325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867470.61328: variable 'omit' from source: magic vars 25039 1726867470.61767: variable 'ansible_distribution_major_version' from source: facts 25039 1726867470.61772: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867470.61774: _execute() done 25039 1726867470.61776: dumping result to json 25039 1726867470.61780: done dumping result, returning 25039 1726867470.61783: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-3ddc-7272-0000000005ce] 25039 1726867470.61785: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005ce 25039 1726867470.61846: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005ce 25039 1726867470.61849: WORKER PROCESS EXITING 25039 1726867470.61885: no more pending results, returning what we have 25039 1726867470.61891: in VariableManager get_vars() 25039 1726867470.61934: Calling all_inventory to load vars for managed_node1 25039 1726867470.61938: Calling groups_inventory to load vars for managed_node1 25039 1726867470.61940: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867470.61954: Calling all_plugins_play to load vars for managed_node1 25039 1726867470.61957: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867470.61960: Calling groups_plugins_play to load vars for managed_node1 25039 1726867470.62996: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867470.63844: done with get_vars() 25039 1726867470.63857: variable 'ansible_search_path' from source: unknown 25039 1726867470.63858: variable 'ansible_search_path' from source: unknown 25039 1726867470.63883: we have included files to process 25039 1726867470.63884: generating all_blocks data 25039 1726867470.63885: done generating all_blocks data 25039 1726867470.63889: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25039 1726867470.63889: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25039 1726867470.63891: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25039 1726867470.63956: in VariableManager get_vars() 25039 1726867470.63971: done with get_vars() 25039 1726867470.64044: done processing included file 25039 1726867470.64046: iterating over new_blocks loaded from include file 25039 1726867470.64046: in VariableManager get_vars() 25039 1726867470.64061: done with get_vars() 25039 1726867470.64063: filtering new block on tags 25039 1726867470.64073: done filtering new block on tags 25039 1726867470.64075: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 25039 1726867470.64080: extending task lists for all hosts with included blocks 25039 1726867470.64459: done extending task lists 25039 1726867470.64461: done processing included files 25039 1726867470.64462: results queue empty 25039 1726867470.64463: checking for any_errors_fatal 25039 1726867470.64466: done checking for any_errors_fatal 25039 1726867470.64467: checking for 
max_fail_percentage 25039 1726867470.64468: done checking for max_fail_percentage 25039 1726867470.64469: checking to see if all hosts have failed and the running result is not ok 25039 1726867470.64470: done checking to see if all hosts have failed 25039 1726867470.64471: getting the remaining hosts for this loop 25039 1726867470.64481: done getting the remaining hosts for this loop 25039 1726867470.64484: getting the next task for host managed_node1 25039 1726867470.64489: done getting next task for host managed_node1 25039 1726867470.64491: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 25039 1726867470.64493: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867470.64496: getting variables 25039 1726867470.64497: in VariableManager get_vars() 25039 1726867470.64513: Calling all_inventory to load vars for managed_node1 25039 1726867470.64515: Calling groups_inventory to load vars for managed_node1 25039 1726867470.64517: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867470.64522: Calling all_plugins_play to load vars for managed_node1 25039 1726867470.64524: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867470.64527: Calling groups_plugins_play to load vars for managed_node1 25039 1726867470.65465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867470.66307: done with get_vars() 25039 1726867470.66325: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:24:30 -0400 (0:00:00.058) 0:00:28.189 ****** 25039 1726867470.66373: entering _queue_task() for managed_node1/include_tasks 25039 1726867470.66594: worker is 1 (out of 1 available) 25039 1726867470.66607: exiting _queue_task() for managed_node1/include_tasks 25039 1726867470.66621: done queuing things up, now waiting for results queue to drain 25039 1726867470.66624: waiting for pending results... 
25039 1726867470.66790: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 25039 1726867470.66863: in run() - task 0affcac9-a3a5-3ddc-7272-0000000006e4 25039 1726867470.66872: variable 'ansible_search_path' from source: unknown 25039 1726867470.66875: variable 'ansible_search_path' from source: unknown 25039 1726867470.66906: calling self._execute() 25039 1726867470.66979: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867470.66983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867470.66991: variable 'omit' from source: magic vars 25039 1726867470.67252: variable 'ansible_distribution_major_version' from source: facts 25039 1726867470.67262: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867470.67269: _execute() done 25039 1726867470.67272: dumping result to json 25039 1726867470.67275: done dumping result, returning 25039 1726867470.67283: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-3ddc-7272-0000000006e4] 25039 1726867470.67294: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000006e4 25039 1726867470.67368: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000006e4 25039 1726867470.67370: WORKER PROCESS EXITING 25039 1726867470.67416: no more pending results, returning what we have 25039 1726867470.67421: in VariableManager get_vars() 25039 1726867470.67462: Calling all_inventory to load vars for managed_node1 25039 1726867470.67465: Calling groups_inventory to load vars for managed_node1 25039 1726867470.67467: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867470.67479: Calling all_plugins_play to load vars for managed_node1 25039 1726867470.67482: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867470.67485: Calling groups_plugins_play to load vars for managed_node1 25039 
1726867470.68245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867470.69201: done with get_vars() 25039 1726867470.69218: variable 'ansible_search_path' from source: unknown 25039 1726867470.69220: variable 'ansible_search_path' from source: unknown 25039 1726867470.69275: we have included files to process 25039 1726867470.69276: generating all_blocks data 25039 1726867470.69280: done generating all_blocks data 25039 1726867470.69281: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25039 1726867470.69282: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25039 1726867470.69287: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25039 1726867470.69534: done processing included file 25039 1726867470.69536: iterating over new_blocks loaded from include file 25039 1726867470.69538: in VariableManager get_vars() 25039 1726867470.69555: done with get_vars() 25039 1726867470.69557: filtering new block on tags 25039 1726867470.69573: done filtering new block on tags 25039 1726867470.69575: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 25039 1726867470.69582: extending task lists for all hosts with included blocks 25039 1726867470.69724: done extending task lists 25039 1726867470.69725: done processing included files 25039 1726867470.69726: results queue empty 25039 1726867470.69727: checking for any_errors_fatal 25039 1726867470.69729: done checking for any_errors_fatal 25039 1726867470.69730: checking for max_fail_percentage 25039 1726867470.69731: done 
checking for max_fail_percentage 25039 1726867470.69732: checking to see if all hosts have failed and the running result is not ok 25039 1726867470.69733: done checking to see if all hosts have failed 25039 1726867470.69733: getting the remaining hosts for this loop 25039 1726867470.69734: done getting the remaining hosts for this loop 25039 1726867470.69737: getting the next task for host managed_node1 25039 1726867470.69741: done getting next task for host managed_node1 25039 1726867470.69744: ^ task is: TASK: Gather current interface info 25039 1726867470.69747: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867470.69749: getting variables 25039 1726867470.69750: in VariableManager get_vars() 25039 1726867470.69763: Calling all_inventory to load vars for managed_node1 25039 1726867470.69765: Calling groups_inventory to load vars for managed_node1 25039 1726867470.69767: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867470.69771: Calling all_plugins_play to load vars for managed_node1 25039 1726867470.69774: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867470.69776: Calling groups_plugins_play to load vars for managed_node1 25039 1726867470.70541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867470.71384: done with get_vars() 25039 1726867470.71397: done getting variables 25039 1726867470.71424: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:24:30 -0400 (0:00:00.050) 0:00:28.240 ****** 25039 1726867470.71445: entering _queue_task() for managed_node1/command 25039 1726867470.71652: worker is 1 (out of 1 available) 25039 1726867470.71666: exiting _queue_task() for managed_node1/command 25039 1726867470.71678: done queuing things up, now waiting for results queue to drain 25039 1726867470.71680: waiting for pending results... 
25039 1726867470.72047: running TaskExecutor() for managed_node1/TASK: Gather current interface info 25039 1726867470.72052: in run() - task 0affcac9-a3a5-3ddc-7272-00000000071b 25039 1726867470.72057: variable 'ansible_search_path' from source: unknown 25039 1726867470.72059: variable 'ansible_search_path' from source: unknown 25039 1726867470.72062: calling self._execute() 25039 1726867470.72126: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867470.72139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867470.72152: variable 'omit' from source: magic vars 25039 1726867470.72505: variable 'ansible_distribution_major_version' from source: facts 25039 1726867470.72534: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867470.72545: variable 'omit' from source: magic vars 25039 1726867470.72601: variable 'omit' from source: magic vars 25039 1726867470.72647: variable 'omit' from source: magic vars 25039 1726867470.72689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867470.72728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867470.72760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867470.72782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867470.72798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867470.72828: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867470.72831: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867470.72833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 
1726867470.72955: Set connection var ansible_shell_executable to /bin/sh 25039 1726867470.72959: Set connection var ansible_timeout to 10 25039 1726867470.72962: Set connection var ansible_shell_type to sh 25039 1726867470.72964: Set connection var ansible_connection to ssh 25039 1726867470.72966: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867470.72968: Set connection var ansible_pipelining to False 25039 1726867470.72970: variable 'ansible_shell_executable' from source: unknown 25039 1726867470.72972: variable 'ansible_connection' from source: unknown 25039 1726867470.72975: variable 'ansible_module_compression' from source: unknown 25039 1726867470.72979: variable 'ansible_shell_type' from source: unknown 25039 1726867470.72982: variable 'ansible_shell_executable' from source: unknown 25039 1726867470.72984: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867470.72986: variable 'ansible_pipelining' from source: unknown 25039 1726867470.72988: variable 'ansible_timeout' from source: unknown 25039 1726867470.72990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867470.73121: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867470.73174: variable 'omit' from source: magic vars 25039 1726867470.73179: starting attempt loop 25039 1726867470.73182: running the handler 25039 1726867470.73184: _low_level_execute_command(): starting 25039 1726867470.73187: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867470.73880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 25039 1726867470.73890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867470.73899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867470.73916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 1726867470.73941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867470.73992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867470.74028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867470.74032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867470.74049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867470.74136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867470.75829: stdout chunk (state=3): >>>/root <<< 25039 1726867470.75956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867470.75958: stdout chunk (state=3): >>><<< 25039 1726867470.75960: stderr chunk (state=3): >>><<< 25039 1726867470.75972: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867470.75990: _low_level_execute_command(): starting 25039 1726867470.76001: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949 `" && echo ansible-tmp-1726867470.759797-26356-201039580149949="` echo /root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949 `" ) && sleep 0' 25039 1726867470.76412: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867470.76415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867470.76427: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867470.76429: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867470.76431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867470.76473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867470.76478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867470.76531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867470.78396: stdout chunk (state=3): >>>ansible-tmp-1726867470.759797-26356-201039580149949=/root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949 <<< 25039 1726867470.78510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867470.78526: stderr chunk (state=3): >>><<< 25039 1726867470.78529: stdout chunk (state=3): >>><<< 25039 1726867470.78544: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867470.759797-26356-201039580149949=/root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867470.78567: variable 'ansible_module_compression' from source: unknown 25039 1726867470.78607: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867470.78640: variable 'ansible_facts' from source: unknown 25039 1726867470.78695: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949/AnsiballZ_command.py 25039 1726867470.78789: Sending initial data 25039 1726867470.78793: Sent initial data (155 bytes) 25039 1726867470.79182: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867470.79206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867470.79213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867470.79215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867470.79271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867470.79275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867470.79341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867470.81101: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867470.81145: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867470.81211: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpqvmto0mh /root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949/AnsiballZ_command.py <<< 25039 1726867470.81216: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949/AnsiballZ_command.py" <<< 25039 1726867470.81242: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpqvmto0mh" to remote "/root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949/AnsiballZ_command.py" <<< 25039 1726867470.82226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867470.82229: stdout chunk (state=3): >>><<< 25039 1726867470.82231: stderr chunk (state=3): >>><<< 25039 1726867470.82233: done transferring module to remote 25039 1726867470.82292: _low_level_execute_command(): starting 25039 1726867470.82306: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949/ /root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949/AnsiballZ_command.py && sleep 0' 25039 1726867470.83829: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867470.83833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25039 
1726867470.83835: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867470.83841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867470.83888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867470.84110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867470.84183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867470.85905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867470.85938: stderr chunk (state=3): >>><<< 25039 1726867470.85942: stdout chunk (state=3): >>><<< 25039 1726867470.85960: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867470.85963: _low_level_execute_command(): starting 25039 1726867470.85968: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949/AnsiballZ_command.py && sleep 0' 25039 1726867470.86734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867470.86737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867470.86739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867470.86741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867470.86743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867470.86745: stderr chunk (state=3): >>>debug2: match not found <<< 25039 1726867470.86747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867470.86749: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25039 1726867470.86751: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 25039 1726867470.86753: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25039 1726867470.86951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867470.86954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867470.86956: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867470.86959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867470.86961: stderr chunk (state=3): >>>debug2: match found <<< 25039 1726867470.86963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867470.86965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867470.87090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867470.87283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867471.02710: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:24:31.021815", "end": "2024-09-20 17:24:31.025114", "delta": "0:00:00.003299", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867471.04275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867471.04281: stdout chunk (state=3): >>><<< 25039 1726867471.04306: stderr chunk (state=3): >>><<< 25039 1726867471.04310: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:24:31.021815", "end": "2024-09-20 17:24:31.025114", "delta": "0:00:00.003299", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
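The module result above is a JSON payload whose `stdout` carries the interface names from `ls -1` in `/sys/class/net`, one per line. A minimal sketch of how such a payload decodes into the interface list that later becomes `current_interfaces` (the payload below is trimmed to the fields used; the real result also carries the full `invocation` block):

```python
import json

# Trimmed copy of the module result captured in the log record above.
raw_result = '''{"changed": true, "stdout": "bonding_masters\\neth0\\nlo\\nveth0",
                 "stderr": "", "rc": 0, "cmd": ["ls", "-1"]}'''

result = json.loads(raw_result)

# Splitting stdout on newlines yields the interface list that the
# later "Set current_interfaces" task stores as a fact.
interfaces = result["stdout"].splitlines()
print(interfaces)  # → ['bonding_masters', 'eth0', 'lo', 'veth0']
```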
25039 1726867471.04351: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867471.04383: _low_level_execute_command(): starting 25039 1726867471.04387: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867470.759797-26356-201039580149949/ > /dev/null 2>&1 && sleep 0' 25039 1726867471.04940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867471.04965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25039 1726867471.04968: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867471.04970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.05012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867471.05023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867471.05082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867471.06929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867471.06932: stdout chunk (state=3): >>><<< 25039 1726867471.06934: stderr chunk (state=3): >>><<< 25039 1726867471.06948: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 25039 1726867471.07086: handler run complete 25039 1726867471.07090: Evaluated conditional (False): False 25039 1726867471.07092: attempt loop complete, returning result 25039 1726867471.07094: _execute() done 25039 1726867471.07096: dumping result to json 25039 1726867471.07098: done dumping result, returning 25039 1726867471.07100: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0affcac9-a3a5-3ddc-7272-00000000071b] 25039 1726867471.07103: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000071b 25039 1726867471.07216: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000071b 25039 1726867471.07220: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003299", "end": "2024-09-20 17:24:31.025114", "rc": 0, "start": "2024-09-20 17:24:31.021815" } STDOUT: bonding_masters eth0 lo veth0 25039 1726867471.07304: no more pending results, returning what we have 25039 1726867471.07308: results queue empty 25039 1726867471.07309: checking for any_errors_fatal 25039 1726867471.07311: done checking for any_errors_fatal 25039 1726867471.07312: checking for max_fail_percentage 25039 1726867471.07313: done checking for max_fail_percentage 25039 1726867471.07314: checking to see if all hosts have failed and the running result is not ok 25039 1726867471.07315: done checking to see if all hosts have failed 25039 1726867471.07316: getting the remaining hosts for this loop 25039 1726867471.07317: done getting the remaining hosts for this loop 25039 1726867471.07320: getting the next task for host managed_node1 25039 1726867471.07330: done getting next task for host managed_node1 25039 1726867471.07332: ^ task is: TASK: Set current_interfaces 25039 1726867471.07337: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25039 1726867471.07342: getting variables 25039 1726867471.07343: in VariableManager get_vars() 25039 1726867471.07385: Calling all_inventory to load vars for managed_node1 25039 1726867471.07388: Calling groups_inventory to load vars for managed_node1 25039 1726867471.07390: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867471.07407: Calling all_plugins_play to load vars for managed_node1 25039 1726867471.07410: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867471.07413: Calling groups_plugins_play to load vars for managed_node1 25039 1726867471.08211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867471.09236: done with get_vars() 25039 1726867471.09257: done getting variables 25039 1726867471.09325: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:24:31 -0400 (0:00:00.379) 0:00:28.619 ****** 25039 1726867471.09358: entering _queue_task() for managed_node1/set_fact 25039 1726867471.09662: worker is 1 (out of 1 available) 25039 1726867471.09822: exiting _queue_task() for managed_node1/set_fact 25039 1726867471.09836: done queuing things up, now waiting for results queue to drain 25039 1726867471.09837: waiting for pending results... 25039 1726867471.10095: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 25039 1726867471.10104: in run() - task 0affcac9-a3a5-3ddc-7272-00000000071c 25039 1726867471.10118: variable 'ansible_search_path' from source: unknown 25039 1726867471.10125: variable 'ansible_search_path' from source: unknown 25039 1726867471.10163: calling self._execute() 25039 1726867471.10256: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.10267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.10282: variable 'omit' from source: magic vars 25039 1726867471.10663: variable 'ansible_distribution_major_version' from source: facts 25039 1726867471.10687: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867471.10700: variable 'omit' from source: magic vars 25039 1726867471.10740: variable 'omit' from source: magic vars 25039 1726867471.10829: variable '_current_interfaces' from source: set_fact 25039 1726867471.10881: variable 'omit' from source: magic vars 25039 1726867471.10917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 
1726867471.10947: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867471.10966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867471.10981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867471.10992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867471.11018: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867471.11021: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.11024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.11095: Set connection var ansible_shell_executable to /bin/sh 25039 1726867471.11101: Set connection var ansible_timeout to 10 25039 1726867471.11106: Set connection var ansible_shell_type to sh 25039 1726867471.11109: Set connection var ansible_connection to ssh 25039 1726867471.11118: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867471.11122: Set connection var ansible_pipelining to False 25039 1726867471.11140: variable 'ansible_shell_executable' from source: unknown 25039 1726867471.11143: variable 'ansible_connection' from source: unknown 25039 1726867471.11145: variable 'ansible_module_compression' from source: unknown 25039 1726867471.11148: variable 'ansible_shell_type' from source: unknown 25039 1726867471.11150: variable 'ansible_shell_executable' from source: unknown 25039 1726867471.11153: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.11155: variable 'ansible_pipelining' from source: unknown 25039 1726867471.11157: variable 'ansible_timeout' from source: unknown 25039 1726867471.11162: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node1' 25039 1726867471.11265: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867471.11272: variable 'omit' from source: magic vars 25039 1726867471.11283: starting attempt loop 25039 1726867471.11286: running the handler 25039 1726867471.11293: handler run complete 25039 1726867471.11301: attempt loop complete, returning result 25039 1726867471.11303: _execute() done 25039 1726867471.11307: dumping result to json 25039 1726867471.11309: done dumping result, returning 25039 1726867471.11319: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0affcac9-a3a5-3ddc-7272-00000000071c] 25039 1726867471.11321: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000071c 25039 1726867471.11397: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000071c 25039 1726867471.11400: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "veth0" ] }, "changed": false } 25039 1726867471.11472: no more pending results, returning what we have 25039 1726867471.11475: results queue empty 25039 1726867471.11476: checking for any_errors_fatal 25039 1726867471.11485: done checking for any_errors_fatal 25039 1726867471.11486: checking for max_fail_percentage 25039 1726867471.11487: done checking for max_fail_percentage 25039 1726867471.11488: checking to see if all hosts have failed and the running result is not ok 25039 1726867471.11489: done checking to see if all hosts have failed 25039 1726867471.11490: getting the remaining hosts for this loop 25039 1726867471.11491: done getting the remaining hosts for this loop 25039 1726867471.11494: getting the next task for host 
managed_node1 25039 1726867471.11502: done getting next task for host managed_node1 25039 1726867471.11505: ^ task is: TASK: Show current_interfaces 25039 1726867471.11515: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867471.11519: getting variables 25039 1726867471.11520: in VariableManager get_vars() 25039 1726867471.11552: Calling all_inventory to load vars for managed_node1 25039 1726867471.11554: Calling groups_inventory to load vars for managed_node1 25039 1726867471.11556: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867471.11564: Calling all_plugins_play to load vars for managed_node1 25039 1726867471.11567: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867471.11569: Calling groups_plugins_play to load vars for managed_node1 25039 1726867471.12353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867471.13807: done with get_vars() 25039 1726867471.13827: done getting variables 25039 1726867471.13884: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:24:31 -0400 (0:00:00.045) 0:00:28.664 ****** 25039 1726867471.13913: entering _queue_task() for managed_node1/debug 25039 1726867471.14180: worker is 1 (out of 1 available) 25039 1726867471.14192: exiting _queue_task() for managed_node1/debug 25039 1726867471.14204: done queuing things up, now waiting for results queue to drain 25039 1726867471.14205: waiting for pending results... 
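The `get_vars()` calls above visit variable sources in a fixed order (all_inventory, groups_inventory, all_plugins_inventory, all_plugins_play, groups_plugins_inventory, groups_plugins_play), with later sources overriding earlier ones. A simplified sketch of that layered merge, using hypothetical example data rather than the actual VariableManager internals:

```python
# Each dict stands in for one variable source; order mirrors the
# get_vars() call sequence in the log, so later layers win.
layers = [
    {"ansible_host": "10.31.12.57", "pkg": "old"},  # inventory-level (example)
    {"pkg": "iproute"},                             # play-level override (example)
]

merged = {}
for layer in layers:
    merged.update(layer)  # later sources override earlier ones

print(merged["pkg"])  # → iproute
```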
25039 1726867471.14480: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 25039 1726867471.14562: in run() - task 0affcac9-a3a5-3ddc-7272-0000000006e5 25039 1726867471.14572: variable 'ansible_search_path' from source: unknown 25039 1726867471.14575: variable 'ansible_search_path' from source: unknown 25039 1726867471.14610: calling self._execute() 25039 1726867471.14680: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.14685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.14691: variable 'omit' from source: magic vars 25039 1726867471.14961: variable 'ansible_distribution_major_version' from source: facts 25039 1726867471.14971: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867471.14978: variable 'omit' from source: magic vars 25039 1726867471.15012: variable 'omit' from source: magic vars 25039 1726867471.15080: variable 'current_interfaces' from source: set_fact 25039 1726867471.15101: variable 'omit' from source: magic vars 25039 1726867471.15132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867471.15160: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867471.15179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867471.15192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867471.15202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867471.15227: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867471.15230: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.15233: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.15305: Set connection var ansible_shell_executable to /bin/sh 25039 1726867471.15311: Set connection var ansible_timeout to 10 25039 1726867471.15314: Set connection var ansible_shell_type to sh 25039 1726867471.15317: Set connection var ansible_connection to ssh 25039 1726867471.15324: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867471.15330: Set connection var ansible_pipelining to False 25039 1726867471.15347: variable 'ansible_shell_executable' from source: unknown 25039 1726867471.15350: variable 'ansible_connection' from source: unknown 25039 1726867471.15355: variable 'ansible_module_compression' from source: unknown 25039 1726867471.15357: variable 'ansible_shell_type' from source: unknown 25039 1726867471.15359: variable 'ansible_shell_executable' from source: unknown 25039 1726867471.15361: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.15363: variable 'ansible_pipelining' from source: unknown 25039 1726867471.15365: variable 'ansible_timeout' from source: unknown 25039 1726867471.15370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.15466: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867471.15476: variable 'omit' from source: magic vars 25039 1726867471.15483: starting attempt loop 25039 1726867471.15486: running the handler 25039 1726867471.15522: handler run complete 25039 1726867471.15531: attempt loop complete, returning result 25039 1726867471.15535: _execute() done 25039 1726867471.15537: dumping result to json 25039 1726867471.15539: done dumping result, returning 25039 1726867471.15547: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0affcac9-a3a5-3ddc-7272-0000000006e5] 25039 1726867471.15549: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000006e5 25039 1726867471.15629: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000006e5 25039 1726867471.15632: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'veth0'] 25039 1726867471.15681: no more pending results, returning what we have 25039 1726867471.15685: results queue empty 25039 1726867471.15686: checking for any_errors_fatal 25039 1726867471.15692: done checking for any_errors_fatal 25039 1726867471.15693: checking for max_fail_percentage 25039 1726867471.15694: done checking for max_fail_percentage 25039 1726867471.15695: checking to see if all hosts have failed and the running result is not ok 25039 1726867471.15696: done checking to see if all hosts have failed 25039 1726867471.15697: getting the remaining hosts for this loop 25039 1726867471.15698: done getting the remaining hosts for this loop 25039 1726867471.15701: getting the next task for host managed_node1 25039 1726867471.15712: done getting next task for host managed_node1 25039 1726867471.15715: ^ task is: TASK: Install iproute 25039 1726867471.15717: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867471.15721: getting variables 25039 1726867471.15722: in VariableManager get_vars() 25039 1726867471.15758: Calling all_inventory to load vars for managed_node1 25039 1726867471.15761: Calling groups_inventory to load vars for managed_node1 25039 1726867471.15763: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867471.15772: Calling all_plugins_play to load vars for managed_node1 25039 1726867471.15774: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867471.15779: Calling groups_plugins_play to load vars for managed_node1 25039 1726867471.17002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867471.17935: done with get_vars() 25039 1726867471.17950: done getting variables 25039 1726867471.17993: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 17:24:31 -0400 (0:00:00.041) 0:00:28.705 ****** 25039 1726867471.18015: entering _queue_task() for managed_node1/package 25039 1726867471.18225: worker is 1 (out of 1 available) 25039 1726867471.18238: exiting _queue_task() for managed_node1/package 25039 1726867471.18250: done queuing things up, now waiting for results queue to drain 25039 1726867471.18252: waiting for pending results... 
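The connection vars set above include `ansible_module_compression` = ZIP_DEFLATED: before a module runs remotely, Ansible packs the module source and its dependencies into a deflate-compressed in-memory zip (the AnsiballZ wrapper seen in the executed `AnsiballZ_command.py` paths). A simplified illustration of that packing step, not the actual AnsiballZ builder:

```python
import io
import zipfile

# Stand-in module source (hypothetical; real modules are far larger).
module_source = b"def main():\n    print('hello from module')\n"

# Pack it into an in-memory zip with DEFLATE compression, as the
# ZIP_DEFLATED connection var indicates.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("ansible_module.py", module_source)
payload = buf.getvalue()

# Round-trip: the remote side unpacks the same bytes before execution.
with zipfile.ZipFile(io.BytesIO(payload)) as zf:
    recovered = zf.read("ansible_module.py")
print(recovered == module_source)  # → True
```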
25039 1726867471.18432: running TaskExecutor() for managed_node1/TASK: Install iproute 25039 1726867471.18505: in run() - task 0affcac9-a3a5-3ddc-7272-0000000005cf 25039 1726867471.18519: variable 'ansible_search_path' from source: unknown 25039 1726867471.18522: variable 'ansible_search_path' from source: unknown 25039 1726867471.18548: calling self._execute() 25039 1726867471.18625: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.18631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.18641: variable 'omit' from source: magic vars 25039 1726867471.18914: variable 'ansible_distribution_major_version' from source: facts 25039 1726867471.18925: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867471.18931: variable 'omit' from source: magic vars 25039 1726867471.18956: variable 'omit' from source: magic vars 25039 1726867471.19086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25039 1726867471.20532: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25039 1726867471.20576: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25039 1726867471.20603: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25039 1726867471.20630: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25039 1726867471.20653: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25039 1726867471.20720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25039 1726867471.20748: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25039 1726867471.20767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25039 1726867471.20796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25039 1726867471.20807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25039 1726867471.20881: variable '__network_is_ostree' from source: set_fact 25039 1726867471.20885: variable 'omit' from source: magic vars 25039 1726867471.20907: variable 'omit' from source: magic vars 25039 1726867471.20931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867471.20950: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867471.20965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867471.20980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867471.20989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867471.21015: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867471.21018: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.21021: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 25039 1726867471.21087: Set connection var ansible_shell_executable to /bin/sh 25039 1726867471.21091: Set connection var ansible_timeout to 10 25039 1726867471.21094: Set connection var ansible_shell_type to sh 25039 1726867471.21097: Set connection var ansible_connection to ssh 25039 1726867471.21105: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867471.21109: Set connection var ansible_pipelining to False 25039 1726867471.21129: variable 'ansible_shell_executable' from source: unknown 25039 1726867471.21132: variable 'ansible_connection' from source: unknown 25039 1726867471.21134: variable 'ansible_module_compression' from source: unknown 25039 1726867471.21137: variable 'ansible_shell_type' from source: unknown 25039 1726867471.21139: variable 'ansible_shell_executable' from source: unknown 25039 1726867471.21141: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.21143: variable 'ansible_pipelining' from source: unknown 25039 1726867471.21146: variable 'ansible_timeout' from source: unknown 25039 1726867471.21151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.21225: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867471.21234: variable 'omit' from source: magic vars 25039 1726867471.21239: starting attempt loop 25039 1726867471.21241: running the handler 25039 1726867471.21248: variable 'ansible_facts' from source: unknown 25039 1726867471.21251: variable 'ansible_facts' from source: unknown 25039 1726867471.21280: _low_level_execute_command(): starting 25039 1726867471.21285: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 
1726867471.21756: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867471.21788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.21791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867471.21793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.21844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867471.21848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867471.21853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867471.21906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867471.23576: stdout chunk (state=3): >>>/root <<< 25039 1726867471.23673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867471.23704: stderr chunk (state=3): >>><<< 25039 1726867471.23708: stdout chunk (state=3): >>><<< 25039 1726867471.23729: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867471.23739: _low_level_execute_command(): starting 25039 1726867471.23742: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992 `" && echo ansible-tmp-1726867471.237272-26390-103601229638992="` echo /root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992 `" ) && sleep 0' 25039 1726867471.24153: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867471.24156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.24158: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867471.24160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.24205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867471.24213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867471.24261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867471.26118: stdout chunk (state=3): >>>ansible-tmp-1726867471.237272-26390-103601229638992=/root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992 <<< 25039 1726867471.26225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867471.26245: stderr chunk (state=3): >>><<< 25039 1726867471.26248: stdout chunk (state=3): >>><<< 25039 1726867471.26260: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867471.237272-26390-103601229638992=/root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867471.26284: variable 'ansible_module_compression' from source: unknown 25039 1726867471.26332: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 25039 1726867471.26365: variable 'ansible_facts' from source: unknown 25039 1726867471.26447: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992/AnsiballZ_dnf.py 25039 1726867471.26542: Sending initial data 25039 1726867471.26545: Sent initial data (151 bytes) 25039 1726867471.26941: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867471.26944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867471.26979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.26983: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867471.26986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867471.26989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.27034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867471.27045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867471.27107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867471.28634: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 25039 1726867471.28638: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867471.28675: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867471.28722: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpx2yt1zto /root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992/AnsiballZ_dnf.py <<< 25039 1726867471.28725: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992/AnsiballZ_dnf.py" <<< 25039 1726867471.28767: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpx2yt1zto" to remote "/root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992/AnsiballZ_dnf.py" <<< 25039 1726867471.29439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867471.29475: stderr chunk (state=3): >>><<< 25039 1726867471.29480: stdout chunk (state=3): >>><<< 25039 1726867471.29507: done transferring module to remote 25039 1726867471.29517: _low_level_execute_command(): starting 25039 1726867471.29520: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992/ /root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992/AnsiballZ_dnf.py && sleep 0' 25039 1726867471.29934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867471.29937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867471.29940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867471.29942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867471.29944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.29991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867471.29996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867471.30042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867471.31756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867471.31778: stderr chunk (state=3): >>><<< 25039 1726867471.31782: stdout chunk (state=3): >>><<< 25039 1726867471.31793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867471.31797: _low_level_execute_command(): starting 25039 1726867471.31799: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992/AnsiballZ_dnf.py && sleep 0' 25039 1726867471.32206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867471.32210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867471.32212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.32214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867471.32216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.32259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 
setting O_NONBLOCK <<< 25039 1726867471.32271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867471.32333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867471.72735: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 25039 1726867471.76776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867471.76800: stderr chunk (state=3): >>><<< 25039 1726867471.76803: stdout chunk (state=3): >>><<< 25039 1726867471.76823: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 25039 1726867471.76858: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867471.76865: _low_level_execute_command(): starting 25039 1726867471.76868: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867471.237272-26390-103601229638992/ > /dev/null 2>&1 && sleep 0' 25039 1726867471.77282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867471.77312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867471.77316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.77318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config <<< 25039 1726867471.77322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867471.77325: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.77386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867471.77390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867471.77456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867471.79241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867471.79263: stderr chunk (state=3): >>><<< 25039 1726867471.79267: stdout chunk (state=3): >>><<< 25039 1726867471.79281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867471.79288: handler run complete 25039 1726867471.79403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25039 1726867471.79528: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25039 1726867471.79556: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25039 1726867471.79580: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25039 1726867471.79615: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25039 1726867471.79671: variable '__install_status' from source: set_fact 25039 1726867471.79686: Evaluated conditional (__install_status is success): True 25039 1726867471.79698: attempt loop complete, returning result 25039 1726867471.79700: _execute() done 25039 1726867471.79703: dumping result to json 25039 1726867471.79708: done dumping result, returning 25039 1726867471.79717: done running TaskExecutor() for managed_node1/TASK: Install iproute [0affcac9-a3a5-3ddc-7272-0000000005cf] 25039 1726867471.79721: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005cf 25039 1726867471.79812: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005cf 25039 1726867471.79815: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 25039 1726867471.79914: no more pending results, returning what we have 25039 1726867471.79918: results queue empty 25039 1726867471.79919: checking for any_errors_fatal 25039 
1726867471.79924: done checking for any_errors_fatal 25039 1726867471.79925: checking for max_fail_percentage 25039 1726867471.79926: done checking for max_fail_percentage 25039 1726867471.79927: checking to see if all hosts have failed and the running result is not ok 25039 1726867471.79930: done checking to see if all hosts have failed 25039 1726867471.79930: getting the remaining hosts for this loop 25039 1726867471.79932: done getting the remaining hosts for this loop 25039 1726867471.79935: getting the next task for host managed_node1 25039 1726867471.79941: done getting next task for host managed_node1 25039 1726867471.79943: ^ task is: TASK: Create veth interface {{ interface }} 25039 1726867471.79946: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867471.79954: getting variables 25039 1726867471.79956: in VariableManager get_vars() 25039 1726867471.79995: Calling all_inventory to load vars for managed_node1 25039 1726867471.79998: Calling groups_inventory to load vars for managed_node1 25039 1726867471.80000: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867471.80010: Calling all_plugins_play to load vars for managed_node1 25039 1726867471.80012: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867471.80015: Calling groups_plugins_play to load vars for managed_node1 25039 1726867471.81324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867471.82480: done with get_vars() 25039 1726867471.82496: done getting variables 25039 1726867471.82540: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867471.82625: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 17:24:31 -0400 (0:00:00.646) 0:00:29.352 ****** 25039 1726867471.82648: entering _queue_task() for managed_node1/command 25039 1726867471.82870: worker is 1 (out of 1 available) 25039 1726867471.82887: exiting _queue_task() for managed_node1/command 25039 1726867471.82901: done queuing things up, now waiting for results queue to drain 25039 1726867471.82902: waiting for pending results... 
25039 1726867471.83080: running TaskExecutor() for managed_node1/TASK: Create veth interface veth0 25039 1726867471.83150: in run() - task 0affcac9-a3a5-3ddc-7272-0000000005d0 25039 1726867471.83160: variable 'ansible_search_path' from source: unknown 25039 1726867471.83164: variable 'ansible_search_path' from source: unknown 25039 1726867471.83368: variable 'interface' from source: play vars 25039 1726867471.83431: variable 'interface' from source: play vars 25039 1726867471.83487: variable 'interface' from source: play vars 25039 1726867471.83601: Loaded config def from plugin (lookup/items) 25039 1726867471.83607: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 25039 1726867471.83629: variable 'omit' from source: magic vars 25039 1726867471.83727: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.83735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.83743: variable 'omit' from source: magic vars 25039 1726867471.83994: variable 'ansible_distribution_major_version' from source: facts 25039 1726867471.83997: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867471.84189: variable 'type' from source: play vars 25039 1726867471.84192: variable 'state' from source: include params 25039 1726867471.84195: variable 'interface' from source: play vars 25039 1726867471.84198: variable 'current_interfaces' from source: set_fact 25039 1726867471.84200: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 25039 1726867471.84202: when evaluation is False, skipping this task 25039 1726867471.84204: variable 'item' from source: unknown 25039 1726867471.84258: variable 'item' from source: unknown skipping: [managed_node1] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 
'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 25039 1726867471.84418: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.84421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.84424: variable 'omit' from source: magic vars 25039 1726867471.84588: variable 'ansible_distribution_major_version' from source: facts 25039 1726867471.84592: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867471.84663: variable 'type' from source: play vars 25039 1726867471.84666: variable 'state' from source: include params 25039 1726867471.84681: variable 'interface' from source: play vars 25039 1726867471.84684: variable 'current_interfaces' from source: set_fact 25039 1726867471.84735: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 25039 1726867471.84738: when evaluation is False, skipping this task 25039 1726867471.84741: variable 'item' from source: unknown 25039 1726867471.84767: variable 'item' from source: unknown skipping: [managed_node1] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 25039 1726867471.84959: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.84962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.84964: variable 'omit' from source: magic vars 25039 1726867471.84980: variable 'ansible_distribution_major_version' from source: facts 25039 1726867471.84991: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867471.85158: variable 'type' 
from source: play vars 25039 1726867471.85161: variable 'state' from source: include params 25039 1726867471.85164: variable 'interface' from source: play vars 25039 1726867471.85168: variable 'current_interfaces' from source: set_fact 25039 1726867471.85176: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 25039 1726867471.85180: when evaluation is False, skipping this task 25039 1726867471.85195: variable 'item' from source: unknown 25039 1726867471.85242: variable 'item' from source: unknown skipping: [managed_node1] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 25039 1726867471.85311: dumping result to json 25039 1726867471.85314: done dumping result, returning 25039 1726867471.85316: done running TaskExecutor() for managed_node1/TASK: Create veth interface veth0 [0affcac9-a3a5-3ddc-7272-0000000005d0] 25039 1726867471.85318: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d0 skipping: [managed_node1] => { "changed": false } MSG: All items skipped 25039 1726867471.85395: no more pending results, returning what we have 25039 1726867471.85399: results queue empty 25039 1726867471.85400: checking for any_errors_fatal 25039 1726867471.85406: done checking for any_errors_fatal 25039 1726867471.85407: checking for max_fail_percentage 25039 1726867471.85410: done checking for max_fail_percentage 25039 1726867471.85411: checking to see if all hosts have failed and the running result is not ok 25039 1726867471.85412: done checking to see if all hosts have failed 25039 1726867471.85413: getting the remaining hosts for this loop 25039 1726867471.85414: done getting the remaining hosts for this loop 25039 1726867471.85417: getting the next task for host managed_node1 25039 
1726867471.85422: done getting next task for host managed_node1 25039 1726867471.85425: ^ task is: TASK: Set up veth as managed by NetworkManager 25039 1726867471.85428: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25039 1726867471.85431: getting variables 25039 1726867471.85432: in VariableManager get_vars() 25039 1726867471.85481: Calling all_inventory to load vars for managed_node1 25039 1726867471.85484: Calling groups_inventory to load vars for managed_node1 25039 1726867471.85487: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867471.85492: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d0 25039 1726867471.85494: WORKER PROCESS EXITING 25039 1726867471.85503: Calling all_plugins_play to load vars for managed_node1 25039 1726867471.85505: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867471.85510: Calling groups_plugins_play to load vars for managed_node1 25039 1726867471.86950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867471.88580: done with get_vars() 25039 1726867471.88600: done getting variables 25039 1726867471.88665: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 17:24:31 -0400 (0:00:00.060) 0:00:29.412 ****** 25039 1726867471.88700: entering _queue_task() for managed_node1/command 25039 1726867471.88992: worker is 1 (out of 1 available) 25039 1726867471.89010: exiting _queue_task() for managed_node1/command 25039 1726867471.89024: done queuing things up, now waiting for results queue to drain 25039 1726867471.89026: waiting for pending results... 25039 1726867471.89274: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 25039 1726867471.89371: in run() - task 0affcac9-a3a5-3ddc-7272-0000000005d1 25039 1726867471.89381: variable 'ansible_search_path' from source: unknown 25039 1726867471.89385: variable 'ansible_search_path' from source: unknown 25039 1726867471.89423: calling self._execute() 25039 1726867471.89523: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.89529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.89544: variable 'omit' from source: magic vars 25039 1726867471.89922: variable 'ansible_distribution_major_version' from source: facts 25039 1726867471.89934: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867471.90095: variable 'type' from source: play vars 25039 1726867471.90099: variable 'state' from source: include params 25039 1726867471.90105: Evaluated conditional (type == 'veth' and state == 'present'): False 25039 1726867471.90111: when evaluation is False, skipping this task 25039 1726867471.90114: _execute() done 25039 1726867471.90117: dumping result to json 25039 1726867471.90119: done dumping result, returning 25039 1726867471.90121: done running TaskExecutor() for 
managed_node1/TASK: Set up veth as managed by NetworkManager [0affcac9-a3a5-3ddc-7272-0000000005d1] 25039 1726867471.90134: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d1 25039 1726867471.90210: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d1 25039 1726867471.90213: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 25039 1726867471.90280: no more pending results, returning what we have 25039 1726867471.90283: results queue empty 25039 1726867471.90284: checking for any_errors_fatal 25039 1726867471.90292: done checking for any_errors_fatal 25039 1726867471.90293: checking for max_fail_percentage 25039 1726867471.90294: done checking for max_fail_percentage 25039 1726867471.90295: checking to see if all hosts have failed and the running result is not ok 25039 1726867471.90296: done checking to see if all hosts have failed 25039 1726867471.90297: getting the remaining hosts for this loop 25039 1726867471.90298: done getting the remaining hosts for this loop 25039 1726867471.90301: getting the next task for host managed_node1 25039 1726867471.90306: done getting next task for host managed_node1 25039 1726867471.90310: ^ task is: TASK: Delete veth interface {{ interface }} 25039 1726867471.90313: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867471.90317: getting variables 25039 1726867471.90318: in VariableManager get_vars() 25039 1726867471.90350: Calling all_inventory to load vars for managed_node1 25039 1726867471.90352: Calling groups_inventory to load vars for managed_node1 25039 1726867471.90354: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867471.90363: Calling all_plugins_play to load vars for managed_node1 25039 1726867471.90365: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867471.90368: Calling groups_plugins_play to load vars for managed_node1 25039 1726867471.91125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867471.91994: done with get_vars() 25039 1726867471.92011: done getting variables 25039 1726867471.92054: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867471.92135: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 17:24:31 -0400 (0:00:00.034) 0:00:29.447 ****** 25039 1726867471.92157: entering _queue_task() for managed_node1/command 25039 1726867471.92371: worker is 1 (out of 1 available) 25039 1726867471.92387: exiting _queue_task() for managed_node1/command 25039 1726867471.92399: done queuing things up, now waiting for results queue to drain 25039 1726867471.92401: waiting for pending results... 
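The skipped "Create veth interface veth0" task above looped over three `ip link` commands, each guarded by the logged conditional `type == 'veth' and state == 'present' and interface not in current_interfaces`. A minimal sketch of that guard-and-loop logic, assuming the item names from the log (`veth0`, `peerveth0`) and an illustrative `current_interfaces` list — the real values come from play vars and `set_fact`:

```python
# Sketch of the conditional behind the skipped "Create veth interface" loop.
# Interface names come from the log; current_interfaces is an assumed example.
def create_veth_commands(type_, state, interface, current_interfaces):
    """Return the ip(8) loop items the task would run, or [] when skipped."""
    if not (type_ == "veth" and state == "present"
            and interface not in current_interfaces):
        return []  # mirrors "when evaluation is False, skipping this task"
    peer = f"peer{interface}"
    return [
        f"ip link add {interface} type veth peer name {peer}",
        f"ip link set {peer} up",
        f"ip link set {interface} up",
    ]

# In this run the include params set state to 'absent', so every item skips:
print(create_veth_commands("veth", "absent", "veth0", ["eth0", "lo"]))  # → []
```

With `state == "present"` and `veth0` absent from `current_interfaces`, the same function yields exactly the three loop items shown in the skip messages above.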
25039 1726867471.92568: running TaskExecutor() for managed_node1/TASK: Delete veth interface veth0 25039 1726867471.92640: in run() - task 0affcac9-a3a5-3ddc-7272-0000000005d2 25039 1726867471.92650: variable 'ansible_search_path' from source: unknown 25039 1726867471.92655: variable 'ansible_search_path' from source: unknown 25039 1726867471.92683: calling self._execute() 25039 1726867471.92754: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.92758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.92767: variable 'omit' from source: magic vars 25039 1726867471.93021: variable 'ansible_distribution_major_version' from source: facts 25039 1726867471.93031: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867471.93157: variable 'type' from source: play vars 25039 1726867471.93161: variable 'state' from source: include params 25039 1726867471.93166: variable 'interface' from source: play vars 25039 1726867471.93170: variable 'current_interfaces' from source: set_fact 25039 1726867471.93183: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 25039 1726867471.93186: variable 'omit' from source: magic vars 25039 1726867471.93216: variable 'omit' from source: magic vars 25039 1726867471.93286: variable 'interface' from source: play vars 25039 1726867471.93300: variable 'omit' from source: magic vars 25039 1726867471.93332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867471.93358: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867471.93373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867471.93389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 25039 1726867471.93404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867471.93426: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867471.93429: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.93431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.93499: Set connection var ansible_shell_executable to /bin/sh 25039 1726867471.93503: Set connection var ansible_timeout to 10 25039 1726867471.93507: Set connection var ansible_shell_type to sh 25039 1726867471.93512: Set connection var ansible_connection to ssh 25039 1726867471.93522: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867471.93525: Set connection var ansible_pipelining to False 25039 1726867471.93541: variable 'ansible_shell_executable' from source: unknown 25039 1726867471.93544: variable 'ansible_connection' from source: unknown 25039 1726867471.93546: variable 'ansible_module_compression' from source: unknown 25039 1726867471.93549: variable 'ansible_shell_type' from source: unknown 25039 1726867471.93551: variable 'ansible_shell_executable' from source: unknown 25039 1726867471.93554: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867471.93558: variable 'ansible_pipelining' from source: unknown 25039 1726867471.93560: variable 'ansible_timeout' from source: unknown 25039 1726867471.93565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867471.93664: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867471.93671: 
variable 'omit' from source: magic vars 25039 1726867471.93676: starting attempt loop 25039 1726867471.93681: running the handler 25039 1726867471.93693: _low_level_execute_command(): starting 25039 1726867471.93700: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867471.94214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867471.94219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867471.94223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.94273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867471.94276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867471.94281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867471.94340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867471.96024: stdout chunk (state=3): >>>/root <<< 25039 1726867471.96126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867471.96151: stderr chunk 
(state=3): >>><<< 25039 1726867471.96155: stdout chunk (state=3): >>><<< 25039 1726867471.96173: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867471.96185: _low_level_execute_command(): starting 25039 1726867471.96190: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851 `" && echo ansible-tmp-1726867471.961724-26415-23398023327851="` echo /root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851 `" ) && sleep 0' 25039 1726867471.96606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867471.96619: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 25039 1726867471.96622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25039 1726867471.96624: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867471.96626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.96666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867471.96722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867471.98615: stdout chunk (state=3): >>>ansible-tmp-1726867471.961724-26415-23398023327851=/root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851 <<< 25039 1726867471.98724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867471.98745: stderr chunk (state=3): >>><<< 25039 1726867471.98748: stdout chunk (state=3): >>><<< 25039 1726867471.98761: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867471.961724-26415-23398023327851=/root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867471.98785: variable 'ansible_module_compression' from source: unknown 25039 1726867471.98825: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867471.98851: variable 'ansible_facts' from source: unknown 25039 1726867471.98915: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851/AnsiballZ_command.py 25039 1726867471.98995: Sending initial data 25039 1726867471.98999: Sent initial data (154 bytes) 25039 1726867471.99422: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867471.99426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 
debug2: match not found <<< 25039 1726867471.99428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25039 1726867471.99430: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867471.99432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867471.99480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867471.99488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867471.99533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867472.01085: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25039 1726867472.01088: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867472.01127: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25039 1726867472.01180: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmph18qhgqi /root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851/AnsiballZ_command.py <<< 25039 1726867472.01182: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851/AnsiballZ_command.py" <<< 25039 1726867472.01219: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmph18qhgqi" to remote "/root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851/AnsiballZ_command.py" <<< 25039 1726867472.01224: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851/AnsiballZ_command.py" <<< 25039 1726867472.01772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867472.01807: stderr chunk (state=3): >>><<< 25039 1726867472.01813: stdout chunk (state=3): >>><<< 25039 1726867472.01850: done transferring module to remote 25039 1726867472.01858: _low_level_execute_command(): starting 25039 1726867472.01860: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851/ /root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851/AnsiballZ_command.py && sleep 0' 25039 1726867472.02279: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867472.02282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match not found <<< 25039 1726867472.02285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 1726867472.02290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 25039 1726867472.02292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867472.02337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867472.02340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867472.02389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867472.04284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867472.04288: stdout chunk (state=3): >>><<< 25039 1726867472.04290: stderr chunk (state=3): >>><<< 25039 1726867472.04292: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867472.04294: _low_level_execute_command(): starting 25039 1726867472.04296: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851/AnsiballZ_command.py && sleep 0' 25039 1726867472.04781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867472.04784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867472.04786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 1726867472.04788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867472.04790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867472.04833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867472.04837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867472.04904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867472.21394: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 17:24:32.198894", "end": "2024-09-20 17:24:32.208790", "delta": "0:00:00.009896", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867472.23460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867472.23485: stderr chunk (state=3): >>><<< 25039 1726867472.23488: stdout chunk (state=3): >>><<< 25039 1726867472.23503: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 17:24:32.198894", "end": "2024-09-20 17:24:32.208790", "delta": "0:00:00.009896", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867472.23533: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867472.23540: _low_level_execute_command(): starting 25039 1726867472.23546: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867471.961724-26415-23398023327851/ > /dev/null 2>&1 && sleep 0' 25039 1726867472.23948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867472.23951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867472.23982: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867472.23985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867472.23987: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867472.24038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867472.24042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867472.24093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867472.25926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867472.25948: stderr chunk (state=3): >>><<< 25039 1726867472.25951: stdout chunk (state=3): >>><<< 25039 1726867472.25968: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867472.25973: handler run complete 25039 1726867472.25992: Evaluated conditional (False): False 25039 1726867472.26000: attempt loop complete, returning result 25039 1726867472.26002: _execute() done 25039 1726867472.26005: dumping result to json 25039 1726867472.26012: done dumping result, returning 25039 1726867472.26018: done running TaskExecutor() for managed_node1/TASK: Delete veth interface veth0 [0affcac9-a3a5-3ddc-7272-0000000005d2] 25039 1726867472.26022: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d2 25039 1726867472.26118: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d2 25039 1726867472.26120: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.009896", "end": "2024-09-20 17:24:32.208790", "rc": 0, "start": "2024-09-20 17:24:32.198894" } 25039 1726867472.26184: no more pending results, returning what we have 25039 1726867472.26187: results queue empty 25039 1726867472.26188: checking for any_errors_fatal 25039 1726867472.26196: done checking for any_errors_fatal 25039 1726867472.26197: checking for max_fail_percentage 25039 1726867472.26199: done checking for max_fail_percentage 25039 1726867472.26199: checking to see if all hosts have failed and the running result is not ok 25039 1726867472.26200: done checking to see if all hosts have failed 25039 1726867472.26201: getting the remaining hosts for this loop 25039 1726867472.26202: done getting the remaining hosts for this loop 25039 1726867472.26205: getting the next task for host managed_node1 25039 1726867472.26216: done getting next task for host managed_node1 25039 1726867472.26218: ^ task is: TASK: Create dummy interface {{ interface }} 25039 1726867472.26222: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25039 1726867472.26226: getting variables 25039 1726867472.26227: in VariableManager get_vars() 25039 1726867472.26270: Calling all_inventory to load vars for managed_node1 25039 1726867472.26273: Calling groups_inventory to load vars for managed_node1 25039 1726867472.26275: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867472.26290: Calling all_plugins_play to load vars for managed_node1 25039 1726867472.26294: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867472.26296: Calling groups_plugins_play to load vars for managed_node1 25039 1726867472.27195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867472.28064: done with get_vars() 25039 1726867472.28081: done getting variables 25039 1726867472.28127: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867472.28205: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 17:24:32 -0400 (0:00:00.360) 0:00:29.807 ****** 25039 
1726867472.28230: entering _queue_task() for managed_node1/command 25039 1726867472.28454: worker is 1 (out of 1 available) 25039 1726867472.28465: exiting _queue_task() for managed_node1/command 25039 1726867472.28481: done queuing things up, now waiting for results queue to drain 25039 1726867472.28483: waiting for pending results... 25039 1726867472.28648: running TaskExecutor() for managed_node1/TASK: Create dummy interface veth0 25039 1726867472.28722: in run() - task 0affcac9-a3a5-3ddc-7272-0000000005d3 25039 1726867472.28733: variable 'ansible_search_path' from source: unknown 25039 1726867472.28737: variable 'ansible_search_path' from source: unknown 25039 1726867472.28764: calling self._execute() 25039 1726867472.28838: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867472.28843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867472.28852: variable 'omit' from source: magic vars 25039 1726867472.29111: variable 'ansible_distribution_major_version' from source: facts 25039 1726867472.29119: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867472.29250: variable 'type' from source: play vars 25039 1726867472.29255: variable 'state' from source: include params 25039 1726867472.29258: variable 'interface' from source: play vars 25039 1726867472.29261: variable 'current_interfaces' from source: set_fact 25039 1726867472.29272: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 25039 1726867472.29275: when evaluation is False, skipping this task 25039 1726867472.29279: _execute() done 25039 1726867472.29282: dumping result to json 25039 1726867472.29284: done dumping result, returning 25039 1726867472.29287: done running TaskExecutor() for managed_node1/TASK: Create dummy interface veth0 [0affcac9-a3a5-3ddc-7272-0000000005d3] 25039 1726867472.29292: sending task result for task 
0affcac9-a3a5-3ddc-7272-0000000005d3 25039 1726867472.29368: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d3 25039 1726867472.29373: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 25039 1726867472.29426: no more pending results, returning what we have 25039 1726867472.29429: results queue empty 25039 1726867472.29430: checking for any_errors_fatal 25039 1726867472.29436: done checking for any_errors_fatal 25039 1726867472.29437: checking for max_fail_percentage 25039 1726867472.29439: done checking for max_fail_percentage 25039 1726867472.29439: checking to see if all hosts have failed and the running result is not ok 25039 1726867472.29440: done checking to see if all hosts have failed 25039 1726867472.29441: getting the remaining hosts for this loop 25039 1726867472.29442: done getting the remaining hosts for this loop 25039 1726867472.29445: getting the next task for host managed_node1 25039 1726867472.29450: done getting next task for host managed_node1 25039 1726867472.29452: ^ task is: TASK: Delete dummy interface {{ interface }} 25039 1726867472.29455: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867472.29458: getting variables 25039 1726867472.29460: in VariableManager get_vars() 25039 1726867472.29498: Calling all_inventory to load vars for managed_node1 25039 1726867472.29501: Calling groups_inventory to load vars for managed_node1 25039 1726867472.29503: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867472.29515: Calling all_plugins_play to load vars for managed_node1 25039 1726867472.29518: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867472.29520: Calling groups_plugins_play to load vars for managed_node1 25039 1726867472.30252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867472.34667: done with get_vars() 25039 1726867472.34686: done getting variables 25039 1726867472.34721: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867472.34783: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 17:24:32 -0400 (0:00:00.065) 0:00:29.873 ****** 25039 1726867472.34803: entering _queue_task() for managed_node1/command 25039 1726867472.35102: worker is 1 (out of 1 available) 25039 1726867472.35116: exiting _queue_task() for managed_node1/command 25039 1726867472.35129: done queuing things up, now waiting for results queue to drain 25039 1726867472.35130: waiting for pending results... 
25039 1726867472.35593: running TaskExecutor() for managed_node1/TASK: Delete dummy interface veth0 25039 1726867472.35598: in run() - task 0affcac9-a3a5-3ddc-7272-0000000005d4 25039 1726867472.35602: variable 'ansible_search_path' from source: unknown 25039 1726867472.35604: variable 'ansible_search_path' from source: unknown 25039 1726867472.35607: calling self._execute() 25039 1726867472.35689: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867472.35701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867472.35721: variable 'omit' from source: magic vars 25039 1726867472.36089: variable 'ansible_distribution_major_version' from source: facts 25039 1726867472.36105: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867472.36317: variable 'type' from source: play vars 25039 1726867472.36329: variable 'state' from source: include params 25039 1726867472.36338: variable 'interface' from source: play vars 25039 1726867472.36347: variable 'current_interfaces' from source: set_fact 25039 1726867472.36361: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 25039 1726867472.36368: when evaluation is False, skipping this task 25039 1726867472.36381: _execute() done 25039 1726867472.36389: dumping result to json 25039 1726867472.36397: done dumping result, returning 25039 1726867472.36407: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface veth0 [0affcac9-a3a5-3ddc-7272-0000000005d4] 25039 1726867472.36420: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d4 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25039 1726867472.36669: no more pending results, returning what we have 25039 1726867472.36673: results queue empty 25039 
1726867472.36675: checking for any_errors_fatal 25039 1726867472.36684: done checking for any_errors_fatal 25039 1726867472.36685: checking for max_fail_percentage 25039 1726867472.36687: done checking for max_fail_percentage 25039 1726867472.36688: checking to see if all hosts have failed and the running result is not ok 25039 1726867472.36689: done checking to see if all hosts have failed 25039 1726867472.36690: getting the remaining hosts for this loop 25039 1726867472.36691: done getting the remaining hosts for this loop 25039 1726867472.36695: getting the next task for host managed_node1 25039 1726867472.36703: done getting next task for host managed_node1 25039 1726867472.36705: ^ task is: TASK: Create tap interface {{ interface }} 25039 1726867472.36711: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867472.36716: getting variables 25039 1726867472.36718: in VariableManager get_vars() 25039 1726867472.36762: Calling all_inventory to load vars for managed_node1 25039 1726867472.36765: Calling groups_inventory to load vars for managed_node1 25039 1726867472.36768: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867472.36985: Calling all_plugins_play to load vars for managed_node1 25039 1726867472.36989: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867472.36994: Calling groups_plugins_play to load vars for managed_node1 25039 1726867472.37691: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d4 25039 1726867472.37694: WORKER PROCESS EXITING 25039 1726867472.38299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867472.39626: done with get_vars() 25039 1726867472.39643: done getting variables 25039 1726867472.39693: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867472.39770: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 17:24:32 -0400 (0:00:00.049) 0:00:29.923 ****** 25039 1726867472.39794: entering _queue_task() for managed_node1/command 25039 1726867472.40029: worker is 1 (out of 1 available) 25039 1726867472.40043: exiting _queue_task() for managed_node1/command 25039 1726867472.40056: done queuing things up, now waiting for results queue to drain 25039 1726867472.40058: waiting for pending results... 
25039 1726867472.40337: running TaskExecutor() for managed_node1/TASK: Create tap interface veth0 25039 1726867472.40448: in run() - task 0affcac9-a3a5-3ddc-7272-0000000005d5 25039 1726867472.40468: variable 'ansible_search_path' from source: unknown 25039 1726867472.40474: variable 'ansible_search_path' from source: unknown 25039 1726867472.40520: calling self._execute() 25039 1726867472.40622: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867472.40637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867472.40653: variable 'omit' from source: magic vars 25039 1726867472.40976: variable 'ansible_distribution_major_version' from source: facts 25039 1726867472.40987: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867472.41126: variable 'type' from source: play vars 25039 1726867472.41130: variable 'state' from source: include params 25039 1726867472.41135: variable 'interface' from source: play vars 25039 1726867472.41138: variable 'current_interfaces' from source: set_fact 25039 1726867472.41146: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 25039 1726867472.41149: when evaluation is False, skipping this task 25039 1726867472.41153: _execute() done 25039 1726867472.41156: dumping result to json 25039 1726867472.41158: done dumping result, returning 25039 1726867472.41170: done running TaskExecutor() for managed_node1/TASK: Create tap interface veth0 [0affcac9-a3a5-3ddc-7272-0000000005d5] 25039 1726867472.41173: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d5 25039 1726867472.41243: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d5 25039 1726867472.41246: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 25039 1726867472.41315: no more pending results, returning what we have 25039 1726867472.41318: results queue empty 25039 1726867472.41319: checking for any_errors_fatal 25039 1726867472.41324: done checking for any_errors_fatal 25039 1726867472.41324: checking for max_fail_percentage 25039 1726867472.41326: done checking for max_fail_percentage 25039 1726867472.41326: checking to see if all hosts have failed and the running result is not ok 25039 1726867472.41327: done checking to see if all hosts have failed 25039 1726867472.41328: getting the remaining hosts for this loop 25039 1726867472.41329: done getting the remaining hosts for this loop 25039 1726867472.41332: getting the next task for host managed_node1 25039 1726867472.41338: done getting next task for host managed_node1 25039 1726867472.41342: ^ task is: TASK: Delete tap interface {{ interface }} 25039 1726867472.41347: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867472.41351: getting variables 25039 1726867472.41352: in VariableManager get_vars() 25039 1726867472.41385: Calling all_inventory to load vars for managed_node1 25039 1726867472.41388: Calling groups_inventory to load vars for managed_node1 25039 1726867472.41390: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867472.41399: Calling all_plugins_play to load vars for managed_node1 25039 1726867472.41401: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867472.41404: Calling groups_plugins_play to load vars for managed_node1 25039 1726867472.42854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867472.44788: done with get_vars() 25039 1726867472.44815: done getting variables 25039 1726867472.44871: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25039 1726867472.44993: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 17:24:32 -0400 (0:00:00.052) 0:00:29.975 ****** 25039 1726867472.45031: entering _queue_task() for managed_node1/command 25039 1726867472.45334: worker is 1 (out of 1 available) 25039 1726867472.45348: exiting _queue_task() for managed_node1/command 25039 1726867472.45359: done queuing things up, now waiting for results queue to drain 25039 1726867472.45361: waiting for pending results... 
25039 1726867472.45540: running TaskExecutor() for managed_node1/TASK: Delete tap interface veth0 25039 1726867472.45661: in run() - task 0affcac9-a3a5-3ddc-7272-0000000005d6 25039 1726867472.45666: variable 'ansible_search_path' from source: unknown 25039 1726867472.45669: variable 'ansible_search_path' from source: unknown 25039 1726867472.45672: calling self._execute() 25039 1726867472.45733: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867472.45737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867472.45748: variable 'omit' from source: magic vars 25039 1726867472.46016: variable 'ansible_distribution_major_version' from source: facts 25039 1726867472.46026: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867472.46158: variable 'type' from source: play vars 25039 1726867472.46162: variable 'state' from source: include params 25039 1726867472.46166: variable 'interface' from source: play vars 25039 1726867472.46170: variable 'current_interfaces' from source: set_fact 25039 1726867472.46181: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 25039 1726867472.46184: when evaluation is False, skipping this task 25039 1726867472.46186: _execute() done 25039 1726867472.46189: dumping result to json 25039 1726867472.46192: done dumping result, returning 25039 1726867472.46200: done running TaskExecutor() for managed_node1/TASK: Delete tap interface veth0 [0affcac9-a3a5-3ddc-7272-0000000005d6] 25039 1726867472.46203: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d6 25039 1726867472.46278: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000005d6 25039 1726867472.46282: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
25039 1726867472.46331: no more pending results, returning what we have 25039 1726867472.46335: results queue empty 25039 1726867472.46336: checking for any_errors_fatal 25039 1726867472.46342: done checking for any_errors_fatal 25039 1726867472.46343: checking for max_fail_percentage 25039 1726867472.46344: done checking for max_fail_percentage 25039 1726867472.46345: checking to see if all hosts have failed and the running result is not ok 25039 1726867472.46346: done checking to see if all hosts have failed 25039 1726867472.46347: getting the remaining hosts for this loop 25039 1726867472.46348: done getting the remaining hosts for this loop 25039 1726867472.46351: getting the next task for host managed_node1 25039 1726867472.46359: done getting next task for host managed_node1 25039 1726867472.46361: ^ task is: TASK: Clean up namespace 25039 1726867472.46364: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=6, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25039 1726867472.46369: getting variables 25039 1726867472.46370: in VariableManager get_vars() 25039 1726867472.46407: Calling all_inventory to load vars for managed_node1 25039 1726867472.46412: Calling groups_inventory to load vars for managed_node1 25039 1726867472.46414: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867472.46425: Calling all_plugins_play to load vars for managed_node1 25039 1726867472.46427: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867472.46430: Calling groups_plugins_play to load vars for managed_node1 25039 1726867472.48100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867472.49282: done with get_vars() 25039 1726867472.49296: done getting variables 25039 1726867472.49340: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Clean up namespace] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:108 Friday 20 September 2024 17:24:32 -0400 (0:00:00.043) 0:00:30.019 ****** 25039 1726867472.49359: entering _queue_task() for managed_node1/command 25039 1726867472.49564: worker is 1 (out of 1 available) 25039 1726867472.49579: exiting _queue_task() for managed_node1/command 25039 1726867472.49592: done queuing things up, now waiting for results queue to drain 25039 1726867472.49593: waiting for pending results... 
25039 1726867472.49755: running TaskExecutor() for managed_node1/TASK: Clean up namespace 25039 1726867472.49814: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000b4 25039 1726867472.49820: variable 'ansible_search_path' from source: unknown 25039 1726867472.49855: calling self._execute() 25039 1726867472.49934: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867472.49938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867472.49945: variable 'omit' from source: magic vars 25039 1726867472.50382: variable 'ansible_distribution_major_version' from source: facts 25039 1726867472.50385: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867472.50388: variable 'omit' from source: magic vars 25039 1726867472.50390: variable 'omit' from source: magic vars 25039 1726867472.50392: variable 'omit' from source: magic vars 25039 1726867472.50395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867472.50432: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867472.50518: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867472.50558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867472.50574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867472.50627: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867472.50637: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867472.50646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867472.50775: Set connection var ansible_shell_executable to /bin/sh 25039 1726867472.50793: 
Set connection var ansible_timeout to 10 25039 1726867472.50804: Set connection var ansible_shell_type to sh 25039 1726867472.50815: Set connection var ansible_connection to ssh 25039 1726867472.50840: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867472.50850: Set connection var ansible_pipelining to False 25039 1726867472.50883: variable 'ansible_shell_executable' from source: unknown 25039 1726867472.50893: variable 'ansible_connection' from source: unknown 25039 1726867472.50900: variable 'ansible_module_compression' from source: unknown 25039 1726867472.50911: variable 'ansible_shell_type' from source: unknown 25039 1726867472.50919: variable 'ansible_shell_executable' from source: unknown 25039 1726867472.50926: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867472.50937: variable 'ansible_pipelining' from source: unknown 25039 1726867472.51052: variable 'ansible_timeout' from source: unknown 25039 1726867472.51056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867472.51115: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867472.51133: variable 'omit' from source: magic vars 25039 1726867472.51143: starting attempt loop 25039 1726867472.51150: running the handler 25039 1726867472.51179: _low_level_execute_command(): starting 25039 1726867472.51192: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867472.51982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867472.52028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867472.52047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867472.52100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867472.53795: stdout chunk (state=3): >>>/root <<< 25039 1726867472.53973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867472.53976: stdout chunk (state=3): >>><<< 25039 1726867472.53980: stderr chunk (state=3): >>><<< 25039 1726867472.54000: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867472.54021: _low_level_execute_command(): starting 25039 1726867472.54029: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150 `" && echo ansible-tmp-1726867472.5400627-26444-88007960599150="` echo /root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150 `" ) && sleep 0' 25039 1726867472.54615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 
1726867472.54672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867472.54675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867472.54731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867472.54811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867472.56790: stdout chunk (state=3): >>>ansible-tmp-1726867472.5400627-26444-88007960599150=/root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150 <<< 25039 1726867472.56899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867472.56920: stderr chunk (state=3): >>><<< 25039 1726867472.56924: stdout chunk (state=3): >>><<< 25039 1726867472.56964: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867472.5400627-26444-88007960599150=/root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867472.56986: variable 'ansible_module_compression' from source: unknown 25039 1726867472.57108: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867472.57111: variable 'ansible_facts' from source: unknown 25039 1726867472.57283: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150/AnsiballZ_command.py 25039 1726867472.57320: Sending initial data 25039 1726867472.57323: Sent initial data (155 bytes) 25039 1726867472.57939: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867472.57959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867472.57971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867472.58011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867472.58023: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867472.58079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867472.59623: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25039 1726867472.59627: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867472.59669: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867472.59709: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp38457w9w /root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150/AnsiballZ_command.py <<< 25039 1726867472.59714: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150/AnsiballZ_command.py" <<< 25039 1726867472.59749: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmp38457w9w" to remote "/root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150/AnsiballZ_command.py" <<< 25039 1726867472.59754: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150/AnsiballZ_command.py" <<< 25039 1726867472.60307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867472.60339: stderr chunk (state=3): >>><<< 25039 1726867472.60343: stdout chunk (state=3): >>><<< 25039 1726867472.60356: done transferring module to remote 25039 1726867472.60364: _low_level_execute_command(): starting 25039 1726867472.60367: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150/ /root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150/AnsiballZ_command.py && sleep 0' 25039 1726867472.60757: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867472.60760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 
1726867472.60763: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 25039 1726867472.60766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867472.60767: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867472.60813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867472.60817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867472.60873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867472.62623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867472.62642: stderr chunk (state=3): >>><<< 25039 1726867472.62645: stdout chunk (state=3): >>><<< 25039 1726867472.62656: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867472.62659: _low_level_execute_command(): starting 25039 1726867472.62663: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150/AnsiballZ_command.py && sleep 0' 25039 1726867472.63062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867472.63065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867472.63067: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867472.63069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867472.63115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867472.63118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867472.63180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867472.78844: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-20 17:24:32.781287", "end": "2024-09-20 17:24:32.786489", "delta": "0:00:00.005202", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867472.80485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 25039 1726867472.80489: stdout chunk (state=3): >>><<< 25039 1726867472.80491: stderr chunk (state=3): >>><<< 25039 1726867472.80513: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-20 17:24:32.781287", "end": "2024-09-20 17:24:32.786489", "delta": "0:00:00.005202", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 25039 1726867472.80783: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns delete ns1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867472.80787: _low_level_execute_command(): starting 25039 1726867472.80790: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867472.5400627-26444-88007960599150/ > /dev/null 2>&1 && sleep 0' 25039 1726867472.81882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867472.81898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867472.82106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867472.82118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867472.82170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867472.84072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867472.84076: stdout chunk (state=3): >>><<< 25039 1726867472.84080: stderr chunk (state=3): >>><<< 25039 1726867472.84097: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867472.84290: handler run complete 25039 1726867472.84293: Evaluated conditional (False): False 25039 1726867472.84296: attempt loop complete, returning result 25039 1726867472.84298: _execute() done 25039 1726867472.84300: dumping result to json 25039 1726867472.84302: done dumping result, returning 25039 1726867472.84303: done running TaskExecutor() for managed_node1/TASK: Clean up namespace [0affcac9-a3a5-3ddc-7272-0000000000b4] 25039 1726867472.84305: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000b4 25039 1726867472.84373: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000b4 25039 1726867472.84376: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "netns", "delete", "ns1" ], "delta": "0:00:00.005202", "end": "2024-09-20 17:24:32.786489", "rc": 0, "start": "2024-09-20 17:24:32.781287" } 25039 1726867472.84646: no more pending results, returning what we have 25039 1726867472.84650: results queue empty 25039 1726867472.84651: checking for any_errors_fatal 25039 1726867472.84656: done checking for any_errors_fatal 25039 1726867472.84657: checking for max_fail_percentage 25039 1726867472.84659: done checking for max_fail_percentage 25039 
1726867472.84660: checking to see if all hosts have failed and the running result is not ok 25039 1726867472.84661: done checking to see if all hosts have failed 25039 1726867472.84662: getting the remaining hosts for this loop 25039 1726867472.84663: done getting the remaining hosts for this loop 25039 1726867472.84667: getting the next task for host managed_node1 25039 1726867472.84673: done getting next task for host managed_node1 25039 1726867472.84680: ^ task is: TASK: Verify network state restored to default 25039 1726867472.84682: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=7, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25039 1726867472.84686: getting variables 25039 1726867472.84688: in VariableManager get_vars() 25039 1726867472.84732: Calling all_inventory to load vars for managed_node1 25039 1726867472.84735: Calling groups_inventory to load vars for managed_node1 25039 1726867472.84738: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867472.84749: Calling all_plugins_play to load vars for managed_node1 25039 1726867472.84753: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867472.84756: Calling groups_plugins_play to load vars for managed_node1 25039 1726867472.87109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867472.88256: done with get_vars() 25039 1726867472.88274: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:113 Friday 20 September 2024 17:24:32 -0400 (0:00:00.389) 0:00:30.409 ****** 25039 1726867472.88341: entering _queue_task() for 
managed_node1/include_tasks 25039 1726867472.88567: worker is 1 (out of 1 available) 25039 1726867472.88579: exiting _queue_task() for managed_node1/include_tasks 25039 1726867472.88591: done queuing things up, now waiting for results queue to drain 25039 1726867472.88593: waiting for pending results... 25039 1726867472.88754: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 25039 1726867472.88822: in run() - task 0affcac9-a3a5-3ddc-7272-0000000000b5 25039 1726867472.88833: variable 'ansible_search_path' from source: unknown 25039 1726867472.88862: calling self._execute() 25039 1726867472.88939: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867472.88946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867472.88955: variable 'omit' from source: magic vars 25039 1726867472.89383: variable 'ansible_distribution_major_version' from source: facts 25039 1726867472.89386: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867472.89388: _execute() done 25039 1726867472.89390: dumping result to json 25039 1726867472.89391: done dumping result, returning 25039 1726867472.89393: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0affcac9-a3a5-3ddc-7272-0000000000b5] 25039 1726867472.89395: sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000b5 25039 1726867472.89458: done sending task result for task 0affcac9-a3a5-3ddc-7272-0000000000b5 25039 1726867472.89460: WORKER PROCESS EXITING 25039 1726867472.89498: no more pending results, returning what we have 25039 1726867472.89503: in VariableManager get_vars() 25039 1726867472.89546: Calling all_inventory to load vars for managed_node1 25039 1726867472.89549: Calling groups_inventory to load vars for managed_node1 25039 1726867472.89552: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867472.89564: Calling 
all_plugins_play to load vars for managed_node1 25039 1726867472.89567: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867472.89569: Calling groups_plugins_play to load vars for managed_node1 25039 1726867472.91151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867472.93553: done with get_vars() 25039 1726867472.93571: variable 'ansible_search_path' from source: unknown 25039 1726867472.93588: we have included files to process 25039 1726867472.93589: generating all_blocks data 25039 1726867472.93591: done generating all_blocks data 25039 1726867472.93596: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 25039 1726867472.93597: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 25039 1726867472.93600: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 25039 1726867472.94355: done processing included file 25039 1726867472.94357: iterating over new_blocks loaded from include file 25039 1726867472.94358: in VariableManager get_vars() 25039 1726867472.94379: done with get_vars() 25039 1726867472.94382: filtering new block on tags 25039 1726867472.94400: done filtering new block on tags 25039 1726867472.94403: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 25039 1726867472.94411: extending task lists for all hosts with included blocks 25039 1726867472.98920: done extending task lists 25039 1726867472.98922: done processing included files 25039 1726867472.98923: results queue empty 25039 1726867472.98924: checking for any_errors_fatal 25039 
1726867472.98928: done checking for any_errors_fatal 25039 1726867472.98929: checking for max_fail_percentage 25039 1726867472.98929: done checking for max_fail_percentage 25039 1726867472.98930: checking to see if all hosts have failed and the running result is not ok 25039 1726867472.98931: done checking to see if all hosts have failed 25039 1726867472.98932: getting the remaining hosts for this loop 25039 1726867472.98933: done getting the remaining hosts for this loop 25039 1726867472.98935: getting the next task for host managed_node1 25039 1726867472.98938: done getting next task for host managed_node1 25039 1726867472.98940: ^ task is: TASK: Check routes and DNS 25039 1726867472.98943: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867472.98945: getting variables 25039 1726867472.98946: in VariableManager get_vars() 25039 1726867472.98959: Calling all_inventory to load vars for managed_node1 25039 1726867472.98961: Calling groups_inventory to load vars for managed_node1 25039 1726867472.98963: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867472.98968: Calling all_plugins_play to load vars for managed_node1 25039 1726867472.98970: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867472.98972: Calling groups_plugins_play to load vars for managed_node1 25039 1726867473.00587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867473.02625: done with get_vars() 25039 1726867473.02648: done getting variables 25039 1726867473.02697: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 17:24:33 -0400 (0:00:00.143) 0:00:30.552 ****** 25039 1726867473.02731: entering _queue_task() for managed_node1/shell 25039 1726867473.03458: worker is 1 (out of 1 available) 25039 1726867473.03471: exiting _queue_task() for managed_node1/shell 25039 1726867473.03487: done queuing things up, now waiting for results queue to drain 25039 1726867473.03489: waiting for pending results... 
25039 1726867473.03970: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 25039 1726867473.03976: in run() - task 0affcac9-a3a5-3ddc-7272-00000000075e 25039 1726867473.03982: variable 'ansible_search_path' from source: unknown 25039 1726867473.03984: variable 'ansible_search_path' from source: unknown 25039 1726867473.03987: calling self._execute() 25039 1726867473.04015: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867473.04021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867473.04032: variable 'omit' from source: magic vars 25039 1726867473.04415: variable 'ansible_distribution_major_version' from source: facts 25039 1726867473.04424: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867473.04438: variable 'omit' from source: magic vars 25039 1726867473.04482: variable 'omit' from source: magic vars 25039 1726867473.04518: variable 'omit' from source: magic vars 25039 1726867473.04562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867473.04600: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867473.04683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867473.04687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867473.04690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867473.04693: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867473.04695: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867473.04697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867473.04797: 
Set connection var ansible_shell_executable to /bin/sh 25039 1726867473.04803: Set connection var ansible_timeout to 10 25039 1726867473.04812: Set connection var ansible_shell_type to sh 25039 1726867473.04814: Set connection var ansible_connection to ssh 25039 1726867473.04819: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867473.04826: Set connection var ansible_pipelining to False 25039 1726867473.04850: variable 'ansible_shell_executable' from source: unknown 25039 1726867473.04854: variable 'ansible_connection' from source: unknown 25039 1726867473.04857: variable 'ansible_module_compression' from source: unknown 25039 1726867473.04860: variable 'ansible_shell_type' from source: unknown 25039 1726867473.04862: variable 'ansible_shell_executable' from source: unknown 25039 1726867473.04936: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867473.04939: variable 'ansible_pipelining' from source: unknown 25039 1726867473.04942: variable 'ansible_timeout' from source: unknown 25039 1726867473.04945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867473.05021: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867473.05030: variable 'omit' from source: magic vars 25039 1726867473.05035: starting attempt loop 25039 1726867473.05043: running the handler 25039 1726867473.05047: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867473.05071: 
_low_level_execute_command(): starting 25039 1726867473.05083: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867473.05802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867473.05805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867473.05811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867473.05815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867473.05883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867473.05887: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867473.05920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867473.05935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867473.05952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867473.06031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867473.07739: stdout chunk (state=3): >>>/root <<< 25039 1726867473.07903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867473.07906: stdout 
chunk (state=3): >>><<< 25039 1726867473.07914: stderr chunk (state=3): >>><<< 25039 1726867473.07938: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867473.08031: _low_level_execute_command(): starting 25039 1726867473.08036: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811 `" && echo ansible-tmp-1726867473.0794423-26489-255465612140811="` echo /root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811 `" ) && sleep 0' 25039 1726867473.08671: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867473.08705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867473.08725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867473.08806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867473.10699: stdout chunk (state=3): >>>ansible-tmp-1726867473.0794423-26489-255465612140811=/root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811 <<< 25039 1726867473.10866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867473.10869: stdout chunk (state=3): >>><<< 25039 1726867473.10872: stderr chunk (state=3): >>><<< 25039 1726867473.10890: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867473.0794423-26489-255465612140811=/root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867473.11083: variable 'ansible_module_compression' from source: unknown 25039 1726867473.11086: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867473.11088: variable 'ansible_facts' from source: unknown 25039 1726867473.11124: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811/AnsiballZ_command.py 25039 1726867473.11301: Sending initial data 25039 1726867473.11304: Sent initial data (156 bytes) 25039 1726867473.11847: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867473.11892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867473.11979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867473.12000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867473.12016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867473.12090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867473.13639: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867473.13705: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867473.13751: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpentv0_xd /root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811/AnsiballZ_command.py <<< 25039 1726867473.13761: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811/AnsiballZ_command.py" <<< 25039 1726867473.13801: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpentv0_xd" to remote "/root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811/AnsiballZ_command.py" <<< 25039 1726867473.14643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867473.14646: stderr chunk (state=3): >>><<< 25039 1726867473.14649: stdout chunk (state=3): >>><<< 25039 1726867473.14657: done transferring module to remote 25039 1726867473.14670: _low_level_execute_command(): starting 25039 1726867473.14681: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811/ /root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811/AnsiballZ_command.py && sleep 0' 25039 1726867473.15282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867473.15393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867473.15422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867473.15438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867473.15514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867473.17293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867473.17344: stderr chunk (state=3): >>><<< 25039 1726867473.17352: stdout chunk (state=3): >>><<< 25039 1726867473.17371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867473.17391: _low_level_execute_command(): starting 25039 1726867473.17470: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811/AnsiballZ_command.py && sleep 0' 25039 1726867473.17992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867473.18016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867473.18032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867473.18090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867473.18152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867473.18170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867473.18194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 25039 1726867473.18275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867473.34495: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:fe:d3:7d:4f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.12.57/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2897sec preferred_lft 2897sec\n inet6 fe80::8ff:feff:fed3:7d4f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.57 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.57 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:24:33.331845", "end": "2024-09-20 17:24:33.340775", "delta": "0:00:00.008930", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": 
null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867473.35986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 25039 1726867473.36286: stdout chunk (state=3): >>><<< 25039 1726867473.36290: stderr chunk (state=3): >>><<< 25039 1726867473.36292: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:fe:d3:7d:4f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.12.57/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2897sec preferred_lft 2897sec\n inet6 fe80::8ff:feff:fed3:7d4f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.57 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.57 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:24:33.331845", "end": "2024-09-20 17:24:33.340775", "delta": "0:00:00.008930", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f 
/etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
25039 1726867473.36295: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25039 1726867473.36302: _low_level_execute_command(): starting 25039 1726867473.36304: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867473.0794423-26489-255465612140811/ > /dev/null 2>&1 && sleep 0' 25039 1726867473.37562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867473.37670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867473.37833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867473.37906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867473.39900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867473.39904: stdout chunk (state=3): >>><<< 25039 1726867473.39912: stderr chunk (state=3): >>><<< 25039 1726867473.39931: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867473.39942: handler run complete 25039 1726867473.39968: Evaluated conditional (False): False 25039 1726867473.40024: attempt loop complete, returning result 25039 1726867473.40117: _execute() done 25039 1726867473.40120: dumping result to json 25039 1726867473.40122: done dumping result, returning 25039 1726867473.40124: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0affcac9-a3a5-3ddc-7272-00000000075e] 25039 1726867473.40127: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000075e ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008930", "end": "2024-09-20 17:24:33.340775", "rc": 0, "start": "2024-09-20 17:24:33.331845" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:fe:d3:7d:4f brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.12.57/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 2897sec preferred_lft 2897sec inet6 fe80::8ff:feff:fed3:7d4f/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.57 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.57 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 
10.2.32.1 25039 1726867473.40472: no more pending results, returning what we have 25039 1726867473.40475: results queue empty 25039 1726867473.40476: checking for any_errors_fatal 25039 1726867473.40480: done checking for any_errors_fatal 25039 1726867473.40481: checking for max_fail_percentage 25039 1726867473.40483: done checking for max_fail_percentage 25039 1726867473.40484: checking to see if all hosts have failed and the running result is not ok 25039 1726867473.40485: done checking to see if all hosts have failed 25039 1726867473.40485: getting the remaining hosts for this loop 25039 1726867473.40487: done getting the remaining hosts for this loop 25039 1726867473.40490: getting the next task for host managed_node1 25039 1726867473.40497: done getting next task for host managed_node1 25039 1726867473.40499: ^ task is: TASK: Verify DNS and network connectivity 25039 1726867473.40502: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25039 1726867473.40506: getting variables 25039 1726867473.40511: in VariableManager get_vars() 25039 1726867473.40550: Calling all_inventory to load vars for managed_node1 25039 1726867473.40553: Calling groups_inventory to load vars for managed_node1 25039 1726867473.40555: Calling all_plugins_inventory to load vars for managed_node1 25039 1726867473.40566: Calling all_plugins_play to load vars for managed_node1 25039 1726867473.40569: Calling groups_plugins_inventory to load vars for managed_node1 25039 1726867473.40572: Calling groups_plugins_play to load vars for managed_node1 25039 1726867473.41785: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000075e 25039 1726867473.41788: WORKER PROCESS EXITING 25039 1726867473.43107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25039 1726867473.44762: done with get_vars() 25039 1726867473.44786: done getting variables 25039 1726867473.44848: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 17:24:33 -0400 (0:00:00.421) 0:00:30.974 ****** 25039 1726867473.44893: entering _queue_task() for managed_node1/shell 25039 1726867473.45244: worker is 1 (out of 1 available) 25039 1726867473.45256: exiting _queue_task() for managed_node1/shell 25039 1726867473.45268: done queuing things up, now waiting for results queue to drain 25039 1726867473.45270: waiting for pending results... 
25039 1726867473.45571: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 25039 1726867473.45693: in run() - task 0affcac9-a3a5-3ddc-7272-00000000075f 25039 1726867473.45723: variable 'ansible_search_path' from source: unknown 25039 1726867473.45736: variable 'ansible_search_path' from source: unknown 25039 1726867473.45779: calling self._execute() 25039 1726867473.45890: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867473.45902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867473.45924: variable 'omit' from source: magic vars 25039 1726867473.46325: variable 'ansible_distribution_major_version' from source: facts 25039 1726867473.46343: Evaluated conditional (ansible_distribution_major_version != '6'): True 25039 1726867473.46500: variable 'ansible_facts' from source: unknown 25039 1726867473.47280: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 25039 1726867473.47291: variable 'omit' from source: magic vars 25039 1726867473.47334: variable 'omit' from source: magic vars 25039 1726867473.47371: variable 'omit' from source: magic vars 25039 1726867473.47419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25039 1726867473.47459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25039 1726867473.47495: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25039 1726867473.47520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867473.47536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25039 1726867473.47568: variable 'inventory_hostname' from source: host vars for 'managed_node1' 25039 1726867473.47585: variable 
'ansible_host' from source: host vars for 'managed_node1' 25039 1726867473.47593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867473.47695: Set connection var ansible_shell_executable to /bin/sh 25039 1726867473.47705: Set connection var ansible_timeout to 10 25039 1726867473.47716: Set connection var ansible_shell_type to sh 25039 1726867473.47722: Set connection var ansible_connection to ssh 25039 1726867473.47731: Set connection var ansible_module_compression to ZIP_DEFLATED 25039 1726867473.47739: Set connection var ansible_pipelining to False 25039 1726867473.47766: variable 'ansible_shell_executable' from source: unknown 25039 1726867473.47774: variable 'ansible_connection' from source: unknown 25039 1726867473.47783: variable 'ansible_module_compression' from source: unknown 25039 1726867473.47798: variable 'ansible_shell_type' from source: unknown 25039 1726867473.47805: variable 'ansible_shell_executable' from source: unknown 25039 1726867473.47815: variable 'ansible_host' from source: host vars for 'managed_node1' 25039 1726867473.47823: variable 'ansible_pipelining' from source: unknown 25039 1726867473.47830: variable 'ansible_timeout' from source: unknown 25039 1726867473.47838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 25039 1726867473.47983: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867473.48007: variable 'omit' from source: magic vars 25039 1726867473.48084: starting attempt loop 25039 1726867473.48087: running the handler 25039 1726867473.48091: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25039 1726867473.48093: _low_level_execute_command(): starting 25039 1726867473.48095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25039 1726867473.48825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867473.48895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867473.48956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867473.48975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867473.49143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867473.49216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867473.50911: stdout chunk (state=3): >>>/root <<< 25039 1726867473.51058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867473.51061: stdout chunk (state=3): 
>>><<< 25039 1726867473.51063: stderr chunk (state=3): >>><<< 25039 1726867473.51169: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867473.51172: _low_level_execute_command(): starting 25039 1726867473.51175: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767 `" && echo ansible-tmp-1726867473.5109344-26502-94525577219767="` echo /root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767 `" ) && sleep 0' 25039 1726867473.52403: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867473.52430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867473.52557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867473.52570: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867473.52697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867473.54624: stdout chunk (state=3): >>>ansible-tmp-1726867473.5109344-26502-94525577219767=/root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767 <<< 25039 1726867473.54627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867473.54661: stderr chunk (state=3): >>><<< 25039 1726867473.54664: stdout chunk (state=3): >>><<< 25039 1726867473.54691: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867473.5109344-26502-94525577219767=/root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867473.54884: variable 'ansible_module_compression' from source: unknown 25039 1726867473.54888: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-250396hzkg1j8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25039 1726867473.54937: variable 'ansible_facts' from source: unknown 25039 1726867473.55179: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767/AnsiballZ_command.py 25039 1726867473.55424: Sending initial data 25039 1726867473.55437: Sent initial data (155 bytes) 25039 1726867473.56749: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867473.56930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867473.57030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867473.58890: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25039 1726867473.58895: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25039 1726867473.58899: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25039 1726867473.58986: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpganwyali /root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767/AnsiballZ_command.py <<< 25039 1726867473.58998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767/AnsiballZ_command.py" <<< 25039 1726867473.59056: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-250396hzkg1j8/tmpganwyali" to remote "/root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767/AnsiballZ_command.py" <<< 25039 1726867473.60204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867473.60210: stdout chunk (state=3): >>><<< 25039 1726867473.60213: stderr chunk (state=3): >>><<< 25039 1726867473.60336: done transferring module to remote 25039 1726867473.60339: _low_level_execute_command(): starting 25039 1726867473.60341: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767/ /root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767/AnsiballZ_command.py && sleep 0' 25039 1726867473.61692: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867473.61859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867473.61868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867473.61895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867473.61984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867473.63841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25039 1726867473.64086: stderr chunk (state=3): >>><<< 25039 1726867473.64090: stdout chunk (state=3): >>><<< 25039 1726867473.64093: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25039 1726867473.64095: _low_level_execute_command(): starting 25039 1726867473.64098: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767/AnsiballZ_command.py && sleep 0' 25039 1726867473.64450: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25039 1726867473.64460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867473.64472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867473.64493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867473.64506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867473.64515: stderr chunk (state=3): >>>debug2: match not found <<< 25039 1726867473.64524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867473.64631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25039 1726867473.64634: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 25039 1726867473.64636: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25039 1726867473.64639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25039 1726867473.64640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25039 1726867473.64647: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 25039 1726867473.64653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 25039 1726867473.64655: stderr chunk (state=3): >>>debug2: match found <<< 25039 1726867473.64657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25039 1726867473.64665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 25039 1726867473.64679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25039 1726867473.64697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25039 1726867473.64782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25039 1726867473.88624: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6693 0 --:--:-- --:--:-- --:--:-- 6777\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 15314 0 --:--:-- --:--:-- --:--:-- 16166", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:24:33.796612", "end": "2024-09-20 17:24:33.883736", "delta": "0:00:00.087124", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25039 1726867473.90484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 25039 1726867473.90488: stdout chunk (state=3): >>><<< 25039 1726867473.90490: stderr chunk (state=3): >>><<< 25039 1726867473.90494: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6693 0 --:--:-- --:--:-- --:--:-- 6777\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 15314 0 --:--:-- --:--:-- --:--:-- 16166", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:24:33.796612", "end": "2024-09-20 17:24:33.883736", "delta": "0:00:00.087124", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed.
25039 1726867473.90503: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
25039 1726867473.90505: _low_level_execute_command(): starting
25039 1726867473.90507: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867473.5109344-26502-94525577219767/ > /dev/null 2>&1 && sleep 0'
25039 1726867473.91295: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
25039 1726867473.91597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<<
25039 1726867473.91717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
25039 1726867473.92097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
25039 1726867473.93884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
25039 1726867473.93887: stdout chunk (state=3): >>><<<
25039 1726867473.93890: stderr chunk (state=3): >>><<<
25039 1726867473.93892: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
25039 1726867473.93895: handler run complete
25039 1726867473.93897: Evaluated conditional (False): False
25039 1726867473.93899: attempt loop complete, returning result
25039 1726867473.93901: _execute() done
25039 1726867473.93903: dumping result to json
25039 1726867473.93905: done dumping result, returning
25039 1726867473.93907: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0affcac9-a3a5-3ddc-7272-00000000075f]
25039 1726867473.93909: sending task result for task 0affcac9-a3a5-3ddc-7272-00000000075f
25039 1726867473.93982: done sending task result for task 0affcac9-a3a5-3ddc-7272-00000000075f
25039 1726867473.93986: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.087124",
    "end": "2024-09-20 17:24:33.883736",
    "rc": 0,
    "start": "2024-09-20 17:24:33.796612"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0   6693      0 --:--:-- --:--:-- --:--:--  6777
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0  15314      0 --:--:-- --:--:-- --:--:-- 16166
25039 1726867473.94063: no more pending results, returning what we have
25039 1726867473.94068: results queue empty
25039 1726867473.94069:
checking for any_errors_fatal
25039 1726867473.94086: done checking for any_errors_fatal
25039 1726867473.94087: checking for max_fail_percentage
25039 1726867473.94089: done checking for max_fail_percentage
25039 1726867473.94090: checking to see if all hosts have failed and the running result is not ok
25039 1726867473.94091: done checking to see if all hosts have failed
25039 1726867473.94091: getting the remaining hosts for this loop
25039 1726867473.94093: done getting the remaining hosts for this loop
25039 1726867473.94098: getting the next task for host managed_node1
25039 1726867473.94108: done getting next task for host managed_node1
25039 1726867473.94110: ^ task is: TASK: meta (flush_handlers)
25039 1726867473.94112: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867473.94121: getting variables
25039 1726867473.94124: in VariableManager get_vars()
25039 1726867473.94166: Calling all_inventory to load vars for managed_node1
25039 1726867473.94169: Calling groups_inventory to load vars for managed_node1
25039 1726867473.94171: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867473.94195: Calling all_plugins_play to load vars for managed_node1
25039 1726867473.94199: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867473.94202: Calling groups_plugins_play to load vars for managed_node1
25039 1726867473.96535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867473.98308: done with get_vars()
25039 1726867473.98338: done getting variables
25039 1726867473.98409: in VariableManager get_vars()
25039 1726867473.98424: Calling all_inventory to load vars for managed_node1
25039 1726867473.98426: Calling groups_inventory to load vars for managed_node1
25039 1726867473.98435: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867473.98440: Calling all_plugins_play to load vars for managed_node1
25039 1726867473.98443: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867473.98445: Calling groups_plugins_play to load vars for managed_node1
25039 1726867474.01320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867474.02930: done with get_vars()
25039 1726867474.02951: done queuing things up, now waiting for results queue to drain
25039 1726867474.02953: results queue empty
25039 1726867474.02953: checking for any_errors_fatal
25039 1726867474.02956: done checking for any_errors_fatal
25039 1726867474.02957: checking for max_fail_percentage
25039 1726867474.02958: done checking for max_fail_percentage
25039 1726867474.02959: checking to see if all hosts have failed and the running result is not ok
25039 1726867474.02959: done checking to see if all hosts have failed
25039 1726867474.02960: getting the remaining hosts for this loop
25039 1726867474.02961: done getting the remaining hosts for this loop
25039 1726867474.02963: getting the next task for host managed_node1
25039 1726867474.02965: done getting next task for host managed_node1
25039 1726867474.02966: ^ task is: TASK: meta (flush_handlers)
25039 1726867474.02967: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867474.02969: getting variables
25039 1726867474.02970: in VariableManager get_vars()
25039 1726867474.02981: Calling all_inventory to load vars for managed_node1
25039 1726867474.02983: Calling groups_inventory to load vars for managed_node1
25039 1726867474.02984: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867474.02988: Calling all_plugins_play to load vars for managed_node1
25039 1726867474.02989: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867474.02991: Calling groups_plugins_play to load vars for managed_node1
25039 1726867474.03910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867474.05555: done with get_vars()
25039 1726867474.05568: done getting variables
25039 1726867474.05604: in VariableManager get_vars()
25039 1726867474.05615: Calling all_inventory to load vars for managed_node1
25039 1726867474.05617: Calling groups_inventory to load vars for managed_node1
25039 1726867474.05618: Calling all_plugins_inventory to load vars for managed_node1
25039 1726867474.05622: Calling all_plugins_play to load vars for managed_node1
25039 1726867474.05623: Calling groups_plugins_inventory to load vars for managed_node1
25039 1726867474.05625: Calling groups_plugins_play to load vars for managed_node1
25039 1726867474.06247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25039 1726867474.07106: done with get_vars()
25039 1726867474.07123: done queuing things up, now waiting for results queue to drain
25039 1726867474.07125: results queue empty
25039 1726867474.07126: checking for any_errors_fatal
25039 1726867474.07126: done checking for any_errors_fatal
25039 1726867474.07127: checking for max_fail_percentage
25039 1726867474.07127: done checking for max_fail_percentage
25039 1726867474.07128: checking to see if all hosts have failed and the running result is not ok
25039 1726867474.07128: done checking to see if all hosts have failed
25039 1726867474.07129: getting the remaining hosts for this loop
25039 1726867474.07130: done getting the remaining hosts for this loop
25039 1726867474.07132: getting the next task for host managed_node1
25039 1726867474.07134: done getting next task for host managed_node1
25039 1726867474.07134: ^ task is: None
25039 1726867474.07135: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25039 1726867474.07136: done queuing things up, now waiting for results queue to drain
25039 1726867474.07137: results queue empty
25039 1726867474.07137: checking for any_errors_fatal
25039 1726867474.07138: done checking for any_errors_fatal
25039 1726867474.07138: checking for max_fail_percentage
25039 1726867474.07138: done checking for max_fail_percentage
25039 1726867474.07139: checking to see if all hosts have failed and the running result is not ok
25039 1726867474.07139: done checking to see if all hosts have failed
25039 1726867474.07141: getting the next task for host managed_node1
25039 1726867474.07142: done getting next task for host managed_node1
25039 1726867474.07143: ^ task is: None
25039 1726867474.07143: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False

PLAY RECAP *********************************************************************
managed_node1              : ok=75   changed=2    unreachable=0    failed=0    skipped=63   rescued=0    ignored=0

Friday 20 September 2024  17:24:34 -0400 (0:00:00.623)       0:00:31.597 ******
===============================================================================
fedora.linux_system_roles.network : Configure networking connection profiles --- 2.69s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which services are running ---- 2.01s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.79s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.76s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.14s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.06s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Create veth interface veth0 --------------------------------------------- 1.01s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.82s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Check if system is ostree ----------------------------------------------- 0.81s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.81s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Install iproute --------------------------------------------------------- 0.74s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which packages are installed --- 0.73s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Ensure ping6 command is present ----------------------------------------- 0.70s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.70s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Install iproute --------------------------------------------------------- 0.65s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.63s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Verify DNS and network connectivity ------------------------------------- 0.62s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Get stat for interface veth0 -------------------------------------------- 0.57s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Gather current interface info ------------------------------------------- 0.53s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Test gateway can be pinged ---------------------------------------------- 0.48s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:86
25039 1726867474.07235: RUNNING CLEANUP
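The "Verify DNS and network connectivity" task in this run passes an inline script to ansible.legacy.command via `_raw_params`. A minimal standalone sketch of that check, factored into a reusable function, is below; the function name `check_host`, the argument-list interface, and the `curl -sf` flags are illustrative additions, not taken from the play itself (the play's script calls plain `curl -o /dev/null`).

```shell
#!/bin/sh
# Sketch of the DNS + HTTPS reachability check the play runs.
# Hosts are taken from the command line instead of being hard-coded.
set -eu

check_host() {
    host=$1
    # DNS lookup: getent exits non-zero when the name does not resolve
    if ! getent hosts "$host" > /dev/null; then
        echo "FAILED to lookup host $host"
        return 1
    fi
    # Connectivity: -f makes curl fail on HTTP errors, -s silences the meter
    if ! curl -sf -o /dev/null "https://$host"; then
        echo "FAILED to contact host $host"
        return 1
    fi
    return 0
}

echo "CHECK DNS AND CONNECTIVITY"
for host in "$@"; do
    check_host "$host" || exit 1
done
```

Run as e.g. `sh check_network.sh mirrors.fedoraproject.org mirrors.centos.org`; a zero exit status reproduces the `rc: 0` result recorded in the task output above.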